Oct 07 13:53:36 crc systemd[1]: Starting Kubernetes Kubelet... Oct 07 13:53:37 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Oct 07 13:53:37 
crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 
13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc 
restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 
crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 
crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 
13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 07 13:53:37 crc 
restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 
13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc 
restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:37 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 07 13:53:38 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 07 13:53:38 crc kubenswrapper[4717]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.598494 4717 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605299 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605325 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605334 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605341 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605348 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605355 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605360 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605366 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605372 4717 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605377 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605382 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605387 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605400 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605406 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605412 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605417 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605422 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605428 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605434 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605440 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605445 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes 
Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605450 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605455 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605460 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605466 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605471 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605477 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605482 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605488 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605495 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605501 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605508 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605514 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605520 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605527 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605532 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605537 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605543 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605548 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605553 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605559 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605564 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605569 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605576 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605584 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605591 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605597 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605603 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605610 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605616 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605621 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605628 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605633 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605639 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605645 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605652 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605657 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605664 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605669 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605675 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605681 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605687 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605692 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605700 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605705 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605712 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605717 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605722 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605727 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605734 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.605741 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606580 4717 flags.go:64] FLAG: --address="0.0.0.0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606597 4717 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606608 4717 flags.go:64] FLAG: --anonymous-auth="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606615 4717 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606623 4717 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606630 4717 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606637 4717 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606645 4717 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606652 4717 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606658 4717 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606665 4717 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606671 4717 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606677 4717 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606683 4717 flags.go:64] FLAG: --cgroup-root="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606702 4717 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606709 4717 flags.go:64] FLAG: --client-ca-file="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606717 4717 flags.go:64] FLAG: --cloud-config="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606724 4717 flags.go:64] FLAG: --cloud-provider="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606731 4717 flags.go:64] FLAG: --cluster-dns="[]" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606739 4717 flags.go:64] FLAG: --cluster-domain="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606745 4717 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606751 4717 flags.go:64] FLAG: --config-dir="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606757 4717 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606764 4717 flags.go:64] FLAG: --container-log-max-files="5" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606773 4717 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606780 4717 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606786 4717 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606792 4717 flags.go:64] FLAG: 
--containerd-namespace="k8s.io" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606799 4717 flags.go:64] FLAG: --contention-profiling="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606804 4717 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606811 4717 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606817 4717 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606823 4717 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606830 4717 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606837 4717 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606843 4717 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606849 4717 flags.go:64] FLAG: --enable-load-reader="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606855 4717 flags.go:64] FLAG: --enable-server="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606861 4717 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606869 4717 flags.go:64] FLAG: --event-burst="100" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606877 4717 flags.go:64] FLAG: --event-qps="50" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606884 4717 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606889 4717 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606895 4717 flags.go:64] FLAG: --eviction-hard="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606903 4717 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606909 4717 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606915 4717 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606921 4717 flags.go:64] FLAG: --eviction-soft="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606927 4717 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606933 4717 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606939 4717 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606945 4717 flags.go:64] FLAG: --experimental-mounter-path="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606951 4717 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606958 4717 flags.go:64] FLAG: --fail-swap-on="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606965 4717 flags.go:64] FLAG: --feature-gates="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606973 4717 flags.go:64] FLAG: --file-check-frequency="20s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606982 4717 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606989 4717 flags.go:64] FLAG: 
--hairpin-mode="promiscuous-bridge" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.606997 4717 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607021 4717 flags.go:64] FLAG: --healthz-port="10248" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607027 4717 flags.go:64] FLAG: --help="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607033 4717 flags.go:64] FLAG: --hostname-override="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607039 4717 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607046 4717 flags.go:64] FLAG: --http-check-frequency="20s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607052 4717 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607058 4717 flags.go:64] FLAG: --image-credential-provider-config="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607064 4717 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607070 4717 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607076 4717 flags.go:64] FLAG: --image-service-endpoint="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607082 4717 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607088 4717 flags.go:64] FLAG: --kube-api-burst="100" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607095 4717 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607101 4717 flags.go:64] FLAG: --kube-api-qps="50" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607107 4717 flags.go:64] FLAG: --kube-reserved="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607113 4717 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607119 4717 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607125 4717 flags.go:64] FLAG: --kubelet-cgroups="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607131 4717 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607137 4717 flags.go:64] FLAG: --lock-file="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607143 4717 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607149 4717 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607155 4717 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607165 4717 flags.go:64] FLAG: --log-json-split-stream="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607171 4717 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607177 4717 flags.go:64] FLAG: --log-text-split-stream="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607183 4717 flags.go:64] FLAG: --logging-format="text" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607189 4717 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607196 4717 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607203 4717 flags.go:64] FLAG: --manifest-url="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607209 4717 flags.go:64] FLAG: --manifest-url-header="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607218 4717 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607225 4717 flags.go:64] FLAG: --max-open-files="1000000" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607232 4717 flags.go:64] FLAG: --max-pods="110" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607239 4717 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607245 4717 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607252 4717 flags.go:64] FLAG: --memory-manager-policy="None" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607258 4717 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607265 4717 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607271 4717 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607278 4717 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607292 4717 flags.go:64] FLAG: --node-status-max-images="50" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607298 4717 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607305 4717 flags.go:64] FLAG: --oom-score-adj="-999" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607311 4717 flags.go:64] FLAG: --pod-cidr="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607317 4717 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607328 4717 flags.go:64] FLAG: --pod-manifest-path="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607334 4717 flags.go:64] FLAG: --pod-max-pids="-1" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607341 4717 flags.go:64] FLAG: --pods-per-core="0" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607347 4717 flags.go:64] FLAG: --port="10250" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607353 4717 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607359 4717 flags.go:64] FLAG: --provider-id="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607365 4717 flags.go:64] FLAG: --qos-reserved="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607371 4717 flags.go:64] FLAG: --read-only-port="10255" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607377 4717 flags.go:64] FLAG: --register-node="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607383 4717 flags.go:64] FLAG: --register-schedulable="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607389 4717 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607399 4717 flags.go:64] FLAG: 
--registry-burst="10" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607405 4717 flags.go:64] FLAG: --registry-qps="5" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607412 4717 flags.go:64] FLAG: --reserved-cpus="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607419 4717 flags.go:64] FLAG: --reserved-memory="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607426 4717 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607433 4717 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607439 4717 flags.go:64] FLAG: --rotate-certificates="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607446 4717 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607452 4717 flags.go:64] FLAG: --runonce="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607459 4717 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607466 4717 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607473 4717 flags.go:64] FLAG: --seccomp-default="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607480 4717 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607487 4717 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607494 4717 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607501 4717 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607507 4717 flags.go:64] FLAG: --storage-driver-password="root" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607514 4717 flags.go:64] FLAG: --storage-driver-secure="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607521 4717 flags.go:64] FLAG: --storage-driver-table="stats" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607527 4717 flags.go:64] FLAG: --storage-driver-user="root" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607533 4717 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607539 4717 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607545 4717 flags.go:64] FLAG: --system-cgroups="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607551 4717 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607561 4717 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607567 4717 flags.go:64] FLAG: --tls-cert-file="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607578 4717 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607585 4717 flags.go:64] FLAG: --tls-min-version="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607592 4717 flags.go:64] FLAG: --tls-private-key-file="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607598 4717 flags.go:64] FLAG: --topology-manager-policy="none" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607605 4717 flags.go:64] FLAG: --topology-manager-policy-options="" 
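The flags.go:64 "FLAG:" entries around this point record the effective value of every kubelet command-line flag at startup, including --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" and --enforce-node-allocatable="[pods]" above. As a rough worked example of what that reservation means for node allocatable memory, the sketch below subtracts it from the MemoryCapacity reported in the cadvisor Machine entry further down this log; it deliberately ignores hard-eviction thresholds, which live in the config file and are not visible in the flag dump, and --kube-reserved, which is empty here.

```go
package main

import "fmt"

func main() {
	// Numbers taken from this log: MemoryCapacity from the Machine entry
	// further down, memory=350Mi from --system-reserved above.
	// Hard-eviction thresholds come from the kubelet config file and are
	// not shown in the flag dump, so this is only an approximation.
	const (
		capacityBytes       = 33654132736       // MemoryCapacity
		systemReservedBytes = 350 * 1024 * 1024 // --system-reserved memory=350Mi
		kubeReservedBytes   = 0                 // --kube-reserved is empty
	)
	allocatable := capacityBytes - systemReservedBytes - kubeReservedBytes
	fmt.Printf("approx. allocatable memory: %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/(1<<30))
}
```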
Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607611 4717 flags.go:64] FLAG: --topology-manager-scope="container" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607617 4717 flags.go:64] FLAG: --v="2" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607626 4717 flags.go:64] FLAG: --version="false" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607634 4717 flags.go:64] FLAG: --vmodule="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607641 4717 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.607648 4717 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607820 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607828 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607835 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607841 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607847 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607853 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607858 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607864 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607869 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607875 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607880 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607885 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607890 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607895 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607900 4717 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607905 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607911 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607917 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607923 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607928 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:53:38 crc 
kubenswrapper[4717]: W1007 13:53:38.607934 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607939 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607945 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607950 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607955 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607961 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607966 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607971 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607977 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607982 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607988 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.607995 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608001 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608028 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608043 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608049 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608055 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608063 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608070 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608077 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608083 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608089 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608095 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608100 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608106 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608112 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608118 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608123 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608129 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608135 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608141 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608148 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608154 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608159 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608165 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608171 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608176 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608181 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608187 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608193 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608198 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608204 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608210 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608215 4717 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608221 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608226 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608232 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608237 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608243 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608248 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.608254 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.609340 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.621827 4717 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.621878 4717 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.621972 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.621982 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.621988 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.621993 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.621999 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622023 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622027 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622032 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622037 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622041 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622046 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622050 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:53:38 crc kubenswrapper[4717]: 
W1007 13:53:38.622054 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622058 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622062 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622065 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622069 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622073 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622077 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622080 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622084 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622087 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622092 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622098 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622102 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622108 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622115 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622119 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622125 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622130 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622135 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622139 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622143 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622147 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622151 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622155 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622159 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622163 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 
13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622168 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622177 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622182 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622186 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622190 4717 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622194 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622199 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622203 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622207 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622211 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622215 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622219 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622222 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622226 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622231 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622237 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622240 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622245 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622250 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622254 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622259 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622263 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622269 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622272 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622277 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622281 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622286 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622290 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622295 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622299 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622303 4717 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622307 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622312 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.622321 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622460 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622470 4717 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622474 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622478 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622483 4717 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622488 4717 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622491 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622495 4717 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622499 4717 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622503 4717 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622506 4717 feature_gate.go:330] 
unrecognized feature gate: VolumeGroupSnapshot Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622511 4717 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622516 4717 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622520 4717 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622525 4717 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622529 4717 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622533 4717 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622538 4717 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622542 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622547 4717 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622550 4717 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622554 4717 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622557 4717 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622562 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622565 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622570 4717 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622573 4717 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622578 4717 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622582 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622586 4717 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622591 4717 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
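After each pass over the gates, the feature_gate.go:386 "feature gates:" entries in this stretch print the final resolved map: only upstream Kubernetes gates survive, and the unrecognized OpenShift gates are absent. The sketch below splits one of those summary entries (copied verbatim from this log) into enabled and disabled gate lists; the parsing is keyed to the map[Name:bool ...] format shown here and is not a general kubelet API.

```go
package main

import (
	"fmt"
	"regexp"
	"sort"
	"strings"
)

func main() {
	// One feature_gate.go:386 summary entry from this log, verbatim.
	line := `feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}`

	m := regexp.MustCompile(`map\[(.*)\]`).FindStringSubmatch(line)
	if m == nil {
		panic("no gate map found")
	}
	var enabled, disabled []string
	for _, kv := range strings.Fields(m[1]) {
		name, val, ok := strings.Cut(kv, ":")
		if !ok {
			continue
		}
		if val == "true" {
			enabled = append(enabled, name)
		} else {
			disabled = append(disabled, name)
		}
	}
	sort.Strings(enabled)
	sort.Strings(disabled)
	fmt.Println("enabled: ", strings.Join(enabled, ", "))
	fmt.Println("disabled:", strings.Join(disabled, ", "))
}
```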
Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622595 4717 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622599 4717 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622604 4717 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622608 4717 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622611 4717 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622615 4717 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622619 4717 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622623 4717 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622627 4717 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622631 4717 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622634 4717 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622638 4717 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622642 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622668 4717 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622673 4717 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622677 4717 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622681 4717 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622685 4717 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622689 4717 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622692 4717 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622696 4717 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622699 4717 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622702 4717 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622706 4717 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622710 4717 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622713 4717 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 07 13:53:38 crc 
kubenswrapper[4717]: W1007 13:53:38.622717 4717 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622720 4717 feature_gate.go:330] unrecognized feature gate: Example Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622724 4717 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622727 4717 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622730 4717 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622734 4717 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622737 4717 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622741 4717 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622744 4717 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622748 4717 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622751 4717 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622755 4717 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622760 4717 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.622765 4717 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.622773 4717 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.623985 4717 server.go:940] "Client rotation is on, will bootstrap in background" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.628202 4717 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.628317 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
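The certificate_store.go entry above names the client credential the kubelet just loaded; the following entries report that rotation is enabled, the certificate's expiry, and a rotation deadline the kubelet picks at a jittered point late in the validity window (hence the ~1445h wait it logs). A small sketch to inspect the same PEM directly on the node, assuming read access to /var/lib/kubelet/pki: the file holds the certificate and key concatenated, and only the CERTIFICATE block is parsed here.

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Same file the kubelet reports loading above.
	data, err := os.ReadFile("/var/lib/kubelet/pki/kubelet-client-current.pem")
	if err != nil {
		panic(err)
	}
	for len(data) > 0 {
		var block *pem.Block
		block, data = pem.Decode(data)
		if block == nil {
			break
		}
		if block.Type != "CERTIFICATE" {
			continue // skip the private key block
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			panic(err)
		}
		fmt.Printf("subject:    %s\n", cert.Subject)
		fmt.Printf("not before: %s\n", cert.NotBefore)
		fmt.Printf("not after:  %s\n", cert.NotAfter)
		fmt.Printf("remaining:  %s\n", time.Until(cert.NotAfter).Round(time.Minute))
	}
}
```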
Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.629949 4717 server.go:997] "Starting client certificate rotation" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.629980 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.631249 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-06 19:39:20.752648044 +0000 UTC Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.631342 4717 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1445h45m42.121307949s for next certificate rotation Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.665571 4717 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.669647 4717 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.690371 4717 log.go:25] "Validated CRI v1 runtime API" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.734265 4717 log.go:25] "Validated CRI v1 image API" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.737797 4717 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.744921 4717 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-07-13-49-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.744970 4717 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.769695 4717 manager.go:217] Machine: {Timestamp:2025-10-07 13:53:38.766403885 +0000 UTC m=+0.594329717 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1df5aeeb-e72c-4f95-84ca-4d7d0e672fce BootID:181bd669-3920-47f2-a947-b62e480db854 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 
DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ed:35:e6 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ed:35:e6 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:e8:b8:cd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:67:9b:5b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:74:85:a9 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:90:4d:04 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7a:ab:83:74:76:48 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:f2:67:1d:43:98:c0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 
BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.770093 4717 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.770362 4717 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.770756 4717 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.771058 4717 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.771121 4717 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.771413 4717 topology_manager.go:138] "Creating topology manager with none policy" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.771430 4717 container_manager_linux.go:303] "Creating device plugin manager" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.771966 4717 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.772035 4717 server.go:66] "Creating device plugin registration server" version="v1beta1" 
socket="/var/lib/kubelet/device-plugins/kubelet.sock" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.772888 4717 state_mem.go:36] "Initialized new in-memory state store" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.773042 4717 server.go:1245] "Using root directory" path="/var/lib/kubelet" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.777380 4717 kubelet.go:418] "Attempting to sync node with API server" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.777413 4717 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.777446 4717 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.777469 4717 kubelet.go:324] "Adding apiserver pod source" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.777485 4717 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.782029 4717 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.783343 4717 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.785367 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.785786 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.785365 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.786874 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.787048 4717 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789110 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789322 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789475 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789616 4717 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/host-path" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789768 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.789932 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790122 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790326 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790470 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790625 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790779 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.790926 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.792317 4717 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.794130 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.794313 4717 server.go:1280] "Started kubelet" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.795104 4717 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.795075 4717 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796199 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796301 4717 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796378 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:56:06.712123077 +0000 UTC Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796436 4717 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 773h2m27.915691446s for next certificate rotation Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796449 4717 volume_manager.go:287] "The desired_state_of_world populator starts" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796482 4717 volume_manager.go:289] "Starting Kubelet Volume Manager" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796644 4717 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.796611 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.796876 4717 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 07 13:53:38 crc systemd[1]: Started Kubernetes 
Kubelet. Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.798603 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.148:6443: connect: connection refused" interval="200ms" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.798835 4717 factory.go:55] Registering systemd factory Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.799033 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.799115 4717 factory.go:221] Registration of the systemd container factory successfully Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.799235 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.799800 4717 factory.go:153] Registering CRI-O factory Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.799822 4717 factory.go:221] Registration of the crio container factory successfully Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.799923 4717 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.804318 4717 factory.go:103] Registering Raw factory Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.804392 4717 manager.go:1196] Started watching for new ooms in manager Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.805791 4717 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.148:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186c39e705d2891d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-07 13:53:38.792933661 +0000 UTC m=+0.620859443,LastTimestamp:2025-10-07 13:53:38.792933661 +0000 UTC m=+0.620859443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.808630 4717 manager.go:319] Starting recovery of all containers Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.812353 4717 server.go:460] "Adding debug handlers to kubelet server" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.817971 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 07 13:53:38 
crc kubenswrapper[4717]: I1007 13:53:38.818110 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818139 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818160 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818212 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818232 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818251 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818270 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818292 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818315 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818334 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818356 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 07 13:53:38 crc 
kubenswrapper[4717]: I1007 13:53:38.818375 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818398 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818417 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818438 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818456 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818475 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818494 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818515 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818534 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818553 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818581 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 
13:53:38.818602 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818623 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818646 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818672 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818696 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818717 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818737 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818756 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818777 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818797 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818817 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818836 4717 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818856 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818876 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818898 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818917 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818939 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818959 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.818982 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819003 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819053 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819075 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819095 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819116 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819136 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819158 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819181 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819202 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819223 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819251 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819273 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819295 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819317 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819340 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819363 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819384 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819403 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819429 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819449 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819469 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819491 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819510 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819532 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819554 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819574 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819593 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819612 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819632 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819652 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819672 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819696 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819719 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819739 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819760 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819781 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819802 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819822 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819843 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819863 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819883 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819903 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819922 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819947 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819968 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.819990 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820035 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820059 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" 
volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820079 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820100 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820119 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820138 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820159 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820180 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820201 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820221 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820241 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820264 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820283 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820303 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820322 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820343 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820371 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820394 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820416 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820437 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820457 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820478 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820501 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820523 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820546 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820566 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820588 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820608 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820627 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820646 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820666 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820686 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820706 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820727 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820749 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820772 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820791 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820811 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820832 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820852 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820871 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820891 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820910 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820930 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820950 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820971 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.820990 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821040 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821063 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821094 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821115 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821135 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821156 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821175 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821195 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821215 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821234 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821265 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821284 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821305 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821324 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821344 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821365 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821386 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821407 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821430 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821449 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821470 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821488 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821507 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821529 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.821547 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823774 4717 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823828 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823853 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823875 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823898 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823917 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823939 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823962 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.823982 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824003 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824052 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824074 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824094 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824115 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824137 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824159 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824180 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824199 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824220 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824242 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824262 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824283 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824304 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824326 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824347 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824369 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824390 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824409 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824451 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824472 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824493 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824512 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824531 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824554 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824573 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824595 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824616 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824635 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824658 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824681 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824702 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824724 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824745 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824768 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824789 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824810 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824830 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824849 4717 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824868 4717 reconstruct.go:97] "Volume reconstruction finished" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.824881 4717 reconciler.go:26] "Reconciler: start to sync state" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.833721 4717 manager.go:324] Recovery completed Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.849242 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.853577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.853635 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.853653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.856457 4717 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.856528 4717 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.856569 4717 state_mem.go:36] "Initialized new in-memory state store" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.864039 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.866995 4717 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.867078 4717 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.867117 4717 kubelet.go:2335] "Starting kubelet main sync loop" Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.867189 4717 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 07 13:53:38 crc kubenswrapper[4717]: W1007 13:53:38.868578 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.868732 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.880107 4717 policy_none.go:49] "None policy: Start" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.880977 4717 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.881040 4717 state_mem.go:35] "Initializing new in-memory state store" Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.897336 4717 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.942263 4717 manager.go:334] "Starting Device Plugin manager" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.942353 4717 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.942375 4717 server.go:79] "Starting device plugin registration server" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.943187 4717 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.943217 4717 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.943806 4717 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 07 13:53:38 crc 
kubenswrapper[4717]: I1007 13:53:38.943945 4717 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.943970 4717 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.953043 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.967879 4717 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.967973 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970586 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970757 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.970793 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972363 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972423 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.972466 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973350 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973595 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973668 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.973721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974715 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974845 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.974891 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975929 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.975962 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:38 crc kubenswrapper[4717]: I1007 13:53:38.976618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:38 crc kubenswrapper[4717]: E1007 13:53:38.999631 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.148:6443: connect: connection refused" interval="400ms" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027219 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027262 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027381 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027550 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027691 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027726 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.027759 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.045466 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.047107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.047157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.047172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.047207 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.047790 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.148:6443: connect: connection refused" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129389 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 
07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129425 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129484 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129522 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129668 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129682 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129695 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod 
\"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129853 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129868 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129927 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129824 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129773 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129937 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.129778 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130165 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130194 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130237 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130251 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130368 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.130501 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.248933 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.251178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.251283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 
13:53:39.251304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.251378 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.252355 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.148:6443: connect: connection refused" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.315301 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.323482 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.346327 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.368396 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-59ca54b721b52683004448ce8656a416169f4f4cc9261f520fa5cd01bc171fa2 WatchSource:0}: Error finding container 59ca54b721b52683004448ce8656a416169f4f4cc9261f520fa5cd01bc171fa2: Status 404 returned error can't find the container with id 59ca54b721b52683004448ce8656a416169f4f4cc9261f520fa5cd01bc171fa2 Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.370576 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9992f06f723994e71bba7bc990c2402ab5bd3f176e279ba2b96e354fe494bb7f WatchSource:0}: Error finding container 9992f06f723994e71bba7bc990c2402ab5bd3f176e279ba2b96e354fe494bb7f: Status 404 returned error can't find the container with id 9992f06f723994e71bba7bc990c2402ab5bd3f176e279ba2b96e354fe494bb7f Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.376554 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.378746 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-47955c6025ffa03da741f89fb36b3ea2aa20c354ccd127c6073089b7c5fe3cc3 WatchSource:0}: Error finding container 47955c6025ffa03da741f89fb36b3ea2aa20c354ccd127c6073089b7c5fe3cc3: Status 404 returned error can't find the container with id 47955c6025ffa03da741f89fb36b3ea2aa20c354ccd127c6073089b7c5fe3cc3 Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.381192 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.401271 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.148:6443: connect: connection refused" interval="800ms" Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.407130 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-033bf2e269bf2052ff15c2e1542fd6f1ddf86a45151405809bca87193260f2b1 WatchSource:0}: Error finding container 033bf2e269bf2052ff15c2e1542fd6f1ddf86a45151405809bca87193260f2b1: Status 404 returned error can't find the container with id 033bf2e269bf2052ff15c2e1542fd6f1ddf86a45151405809bca87193260f2b1 Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.653421 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.655966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.656047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.656072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.656118 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.656840 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.148:6443: connect: connection refused" node="crc" Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.691971 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.692143 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.729642 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.729772 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 
13:53:39.795235 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.875141 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0d07b97fa5a1f2f5fb660aecc8c2539854e491abf03f0319b5fca30b9f178ad7"} Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.877158 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"033bf2e269bf2052ff15c2e1542fd6f1ddf86a45151405809bca87193260f2b1"} Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.878165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47955c6025ffa03da741f89fb36b3ea2aa20c354ccd127c6073089b7c5fe3cc3"} Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.879500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9992f06f723994e71bba7bc990c2402ab5bd3f176e279ba2b96e354fe494bb7f"} Oct 07 13:53:39 crc kubenswrapper[4717]: I1007 13:53:39.880615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"59ca54b721b52683004448ce8656a416169f4f4cc9261f520fa5cd01bc171fa2"} Oct 07 13:53:39 crc kubenswrapper[4717]: W1007 13:53:39.963293 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:39 crc kubenswrapper[4717]: E1007 13:53:39.963395 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:40 crc kubenswrapper[4717]: E1007 13:53:40.202177 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.148:6443: connect: connection refused" interval="1.6s" Oct 07 13:53:40 crc kubenswrapper[4717]: W1007 13:53:40.362575 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:40 crc kubenswrapper[4717]: E1007 13:53:40.362702 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.456978 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.459103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.459145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.459158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.459204 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:40 crc kubenswrapper[4717]: E1007 13:53:40.459873 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.148:6443: connect: connection refused" node="crc" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.795690 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.887317 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414" exitCode=0 Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.887708 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.887883 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.890304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.890365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.890390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.891488 4717 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565" exitCode=0 Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.891586 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.891670 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.893648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.893714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.893733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.897642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.897717 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.897721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.897868 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.897887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.898861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.898902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.898917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.899530 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9" exitCode=0 Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.899603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.899653 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.900742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.900788 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.900812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.901617 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8" exitCode=0 Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.901667 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8"} Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.901760 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.902754 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:40 crc kubenswrapper[4717]: I1007 13:53:40.903690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.795554 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:41 crc kubenswrapper[4717]: E1007 13:53:41.803947 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.148:6443: connect: connection refused" interval="3.2s" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.924870 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec" exitCode=0 Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.924959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.925140 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:41 crc 
kubenswrapper[4717]: I1007 13:53:41.926087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.926115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.926126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.929360 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.929516 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.930947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.930986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.930999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.947289 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.947363 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.947376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.947327 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.948254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.948288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.948298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.953550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.953592 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.953606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.953616 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f"} Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.953621 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.954453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.954487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:41 crc kubenswrapper[4717]: I1007 13:53:41.954510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.060909 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.062675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.062755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.062779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.062830 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:42 crc kubenswrapper[4717]: E1007 13:53:42.063667 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.148:6443: connect: connection refused" node="crc" Oct 07 13:53:42 crc kubenswrapper[4717]: W1007 13:53:42.282355 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:42 crc kubenswrapper[4717]: E1007 13:53:42.282461 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:42 crc kubenswrapper[4717]: W1007 13:53:42.492071 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:42 crc kubenswrapper[4717]: E1007 13:53:42.492197 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:42 crc kubenswrapper[4717]: W1007 13:53:42.576351 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.148:6443: connect: connection refused Oct 07 13:53:42 crc kubenswrapper[4717]: E1007 13:53:42.576520 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.148:6443: connect: connection refused" logger="UnhandledError" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.694823 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.702455 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.961597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2"} Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.961705 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.962894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.962952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.962965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964301 4717 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a" exitCode=0 Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964416 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964451 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964461 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a"} Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964473 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.964473 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.965583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.965622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.965642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.966899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.966942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.966946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.967030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.967073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.967089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.966968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.967157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:42 crc kubenswrapper[4717]: I1007 13:53:42.967043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.840588 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975194 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282"} Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975336 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975334 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778"} Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 
13:53:43.975497 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975520 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00"} Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f"} Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9"} Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975339 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975631 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.975444 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:43 crc kubenswrapper[4717]: I1007 13:53:43.977425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.906680 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.978652 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.978736 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 
13:53:44.978684 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:44 crc kubenswrapper[4717]: I1007 13:53:44.980408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.264790 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.266202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.266259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.266278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.266312 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.976263 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.981404 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.981436 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.982791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.982829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.982840 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.983262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.983331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:45 crc kubenswrapper[4717]: I1007 13:53:45.983347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:46 crc kubenswrapper[4717]: 
I1007 13:53:46.405156 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.405449 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.406838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.406875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.406884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.468811 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.469338 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.473364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.473415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.473427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.969836 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.984438 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.985437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.985490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:46 crc kubenswrapper[4717]: I1007 13:53:46.985505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:48 crc kubenswrapper[4717]: E1007 13:53:48.953333 4717 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.068346 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.068620 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.070059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.070093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.070104 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.126738 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.126995 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.128626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.128674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.128686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.134636 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.993372 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.994682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.994733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:49 crc kubenswrapper[4717]: I1007 13:53:49.994746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:50 crc kubenswrapper[4717]: I1007 13:53:50.108918 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:53:50 crc kubenswrapper[4717]: I1007 13:53:50.995385 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:50 crc kubenswrapper[4717]: I1007 13:53:50.996545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:50 crc kubenswrapper[4717]: I1007 13:53:50.996579 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:50 crc kubenswrapper[4717]: I1007 13:53:50.996589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:52 crc kubenswrapper[4717]: I1007 13:53:52.795908 4717 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.002843 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.006153 4717 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2" exitCode=255 Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 
13:53:53.006234 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2"} Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.006487 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.007996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.008058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.008072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.008751 4717 scope.go:117] "RemoveContainer" containerID="dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.109997 4717 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.110154 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 07 13:53:53 crc kubenswrapper[4717]: W1007 13:53:53.493547 4717 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.493662 4717 trace.go:236] Trace[1889554605]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:53:43.492) (total time: 10001ms): Oct 07 13:53:53 crc kubenswrapper[4717]: Trace[1889554605]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:53:53.493) Oct 07 13:53:53 crc kubenswrapper[4717]: Trace[1889554605]: [10.001508729s] [10.001508729s] END Oct 07 13:53:53 crc kubenswrapper[4717]: E1007 13:53:53.493688 4717 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.557659 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.557748 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.563160 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.563249 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.846477 4717 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]log ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]etcd ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/generic-apiserver-start-informers ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/priority-and-fairness-filter ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-apiextensions-informers ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-apiextensions-controllers ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/crd-informer-synced ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-system-namespaces-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 07 
13:53:53 crc kubenswrapper[4717]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 07 13:53:53 crc kubenswrapper[4717]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/bootstrap-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/start-kube-aggregator-informers ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-registration-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-discovery-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]autoregister-completion ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-openapi-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 07 13:53:53 crc kubenswrapper[4717]: livez check failed Oct 07 13:53:53 crc kubenswrapper[4717]: I1007 13:53:53.846567 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.012214 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.015021 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47"} Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.015240 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.016458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.016499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:54 crc kubenswrapper[4717]: I1007 13:53:54.016515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:55 crc kubenswrapper[4717]: I1007 13:53:55.976580 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:55 crc kubenswrapper[4717]: I1007 13:53:55.976857 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:53:55 crc kubenswrapper[4717]: I1007 13:53:55.978370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:53:55 crc kubenswrapper[4717]: I1007 13:53:55.978465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:53:55 crc kubenswrapper[4717]: I1007 13:53:55.978480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.425255 4717 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.559352 4717 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.573332 4717 trace.go:236] Trace[539002535]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:53:46.147) (total time: 12425ms): Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[539002535]: ---"Objects listed" error: 12425ms (13:53:58.573) Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[539002535]: [12.425696977s] [12.425696977s] END Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.573425 4717 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.573351 4717 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.575404 4717 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.575619 4717 trace.go:236] Trace[1027863579]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:53:46.870) (total time: 11705ms): Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[1027863579]: ---"Objects listed" error: 11705ms (13:53:58.575) Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[1027863579]: [11.705343274s] [11.705343274s] END Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.575649 4717 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.576236 4717 trace.go:236] Trace[273052044]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (07-Oct-2025 13:53:46.178) (total time: 12397ms): Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[273052044]: ---"Objects listed" error: 12397ms (13:53:58.575) Oct 07 13:53:58 crc kubenswrapper[4717]: Trace[273052044]: [12.397507396s] [12.397507396s] END Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.576279 4717 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.790594 4717 apiserver.go:52] "Watching apiserver" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.801304 4717 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.801707 4717 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802199 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802323 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802391 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802336 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.802424 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802501 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.802525 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.802561 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.802606 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.805828 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806233 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806263 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806568 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806628 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806572 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806806 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.806930 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.807388 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.835419 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.845170 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.850447 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.850662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.857731 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.865141 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.874868 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.898076 4717 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.899046 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.920242 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.931295 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.941315 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.955199 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.973298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977211 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977323 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977500 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977527 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:53:58 crc 
kubenswrapper[4717]: I1007 13:53:58.977582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977606 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977630 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977708 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977731 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977776 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977769 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977785 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977801 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977915 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977963 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977980 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.977999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978034 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978051 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978068 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978084 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978105 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978123 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978148 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978168 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978186 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978219 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978300 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978320 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978347 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978349 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978367 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978387 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978431 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978470 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978509 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978528 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978545 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978561 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978579 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978593 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978613 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978631 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978664 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") 
pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978680 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978696 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978711 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978742 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978777 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978795 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978847 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978885 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978902 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978920 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978942 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978961 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978980 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979039 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979058 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979075 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979093 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979109 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979126 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979161 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979179 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979195 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979210 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979227 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979244 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979264 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979282 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979299 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979319 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979340 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979390 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979410 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:53:58 crc 
kubenswrapper[4717]: I1007 13:53:58.979431 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979470 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979489 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979504 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979554 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979573 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979592 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979627 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979643 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979659 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979697 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979715 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979732 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979748 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979765 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" 
(UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979783 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979799 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979833 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979851 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979867 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979886 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979905 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") 
pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979960 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979977 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979995 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980026 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980043 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980059 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980075 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980109 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980127 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980160 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980179 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980199 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980215 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980247 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980269 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980305 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980328 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980372 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980394 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980413 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980433 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980451 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980470 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980486 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980505 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980540 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980559 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980576 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980593 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980611 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980667 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod 
\"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980684 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980703 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980723 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980740 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980758 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980775 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980791 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980808 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980863 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980881 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980899 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980917 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980935 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980954 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980975 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981042 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981063 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981081 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981098 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981113 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981151 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981169 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981208 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981227 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981243 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983076 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.989481 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.978774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979237 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.979929 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980342 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.980962 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981117 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981170 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: E1007 13:53:58.981395 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:53:59.481359504 +0000 UTC m=+21.309285296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997901 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997931 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997909 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997963 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.997993 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998076 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998104 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998113 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998137 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998188 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998242 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998271 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998351 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998401 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998473 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998622 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998730 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.998917 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.999357 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.999377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.999377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981773 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.981911 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.982155 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.982432 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.982457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983062 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983633 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.983709 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984141 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984600 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984652 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.984926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985065 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985313 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985408 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985438 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985461 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985488 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985501 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985574 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985782 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.985898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.986122 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.986406 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.986368 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.986475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.986826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.987212 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.987278 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:58 crc kubenswrapper[4717]: I1007 13:53:58.987751 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.987772 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.987823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.987849 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.987875 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.988172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.988332 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.988524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.988537 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.988708 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989390 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989569 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989596 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989784 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.989845 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.990151 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.990452 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.990510 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.990529 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.990604 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991139 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991355 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.991895 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992173 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992482 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992625 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992692 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.992751 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993210 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993081 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993501 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993539 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.993810 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.994080 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.994112 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.994156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.994180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995139 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995263 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995640 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995959 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.995780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996047 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996301 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996361 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.996783 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.998885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.999748 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000299 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000504 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000515 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000547 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000854 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000889 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.000937 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001190 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001443 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001521 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001673 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001738 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001791 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.001764 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002119 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002491 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002553 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.002603 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.002812 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:53:59.502775193 +0000 UTC m=+21.330700985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002880 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.002803 4717 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.003206 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.003387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:58.981592 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.003456 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.003513 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.003711 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.004139 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.004524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.004545 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.004780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.005146 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.005598 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012257 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012483 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012872 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.013512 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.013608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011268 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011413 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011461 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011765 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.011890 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012040 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.012219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.005073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.013782 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.014371 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.014688 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.014740 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.014731 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.014801 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.014818 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.015180 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:53:59.514987312 +0000 UTC m=+21.342913104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.015277 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:53:59.515201408 +0000 UTC m=+21.343127190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.016579 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.016862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017519 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017573 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017590 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017592 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017604 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017647 4717 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017672 4717 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017687 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017698 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017708 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017722 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017770 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.017823 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018520 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018642 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018859 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018894 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.019138 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.019338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.018939 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.019681 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.022056 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.022420 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.025702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.026526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.026776 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.033523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.034703 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.034738 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.034757 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc 
kubenswrapper[4717]: E1007 13:53:59.034825 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:53:59.534795889 +0000 UTC m=+21.362721681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.035414 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.037683 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.038057 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.038450 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.042973 4717 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.049774 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.052532 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.059401 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.059898 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.063097 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.072298 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.089748 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.104839 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.105974 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118678 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118817 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118830 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118839 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118848 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118858 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118866 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 
07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118875 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118886 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118894 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118880 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118903 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.118983 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119071 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119084 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119095 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119106 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119121 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 
13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119130 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119141 4717 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119160 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119169 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119177 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119185 4717 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119194 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119202 4717 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119211 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119219 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119227 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119238 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc 
kubenswrapper[4717]: I1007 13:53:59.119247 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119255 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119264 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119272 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119280 4717 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119291 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119300 4717 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119310 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119319 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119328 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119336 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119345 4717 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119355 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119363 4717 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119372 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119380 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119391 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119399 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119408 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119419 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119428 4717 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119443 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119452 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119461 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119471 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119480 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119489 4717 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119498 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119509 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119518 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119525 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119534 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119543 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119552 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119560 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119569 4717 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119578 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119587 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119596 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: 
I1007 13:53:59.119604 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119612 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119620 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119629 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119637 4717 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119647 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119655 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119664 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119673 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119681 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119691 4717 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119700 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119709 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 
13:53:59.119728 4717 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119736 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119745 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119754 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119763 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119771 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119780 4717 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119788 4717 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119797 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119806 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119814 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119823 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119831 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119840 4717 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119848 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119857 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119865 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119875 4717 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119884 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119892 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119900 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119910 4717 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119921 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119929 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119938 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119946 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119954 4717 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119962 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119971 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119980 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119989 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.119997 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120032 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120041 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120051 4717 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120060 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120070 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120079 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120089 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120097 4717 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120106 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120114 4717 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120123 4717 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120130 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120139 4717 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120148 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120157 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120167 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120176 4717 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120185 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120194 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120205 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") 
on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120214 4717 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120224 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120235 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120245 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120254 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120263 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120272 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120282 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120290 4717 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120299 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120307 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120315 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120324 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120333 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120342 4717 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120350 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120359 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120372 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120380 4717 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120388 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120397 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120406 4717 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120415 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120424 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120432 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120441 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc 
kubenswrapper[4717]: I1007 13:53:59.120450 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120459 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120468 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120476 4717 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120484 4717 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120492 4717 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120501 4717 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120509 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120518 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120528 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120538 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120547 4717 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120557 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120566 4717 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120576 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120585 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120594 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120604 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.120614 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.123336 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.123441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.124291 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.131311 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.133018 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.135133 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.137166 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.152463 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: W1007 13:53:59.156106 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e2baf0d95da5f89b67dd77c71e1dd22add271a9d2445856aef1db6607ec7d6bf WatchSource:0}: Error finding container e2baf0d95da5f89b67dd77c71e1dd22add271a9d2445856aef1db6607ec7d6bf: Status 404 returned error can't find the container with id e2baf0d95da5f89b67dd77c71e1dd22add271a9d2445856aef1db6607ec7d6bf Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.168735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.181560 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.200475 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.211581 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.226956 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.258154 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.273176 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.291234 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.313301 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":
\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.326174 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.335370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.351755 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.524284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.524466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524573 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:54:00.524528425 +0000 UTC m=+22.352454217 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524647 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524676 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524693 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.524676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524768 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:00.524744821 +0000 UTC m=+22.352670673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524814 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.524957 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:00.524948496 +0000 UTC m=+22.352874278 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.524966 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.525056 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.525092 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:00.525082389 +0000 UTC m=+22.353008181 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.626482 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.626752 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.626798 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.626812 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: E1007 13:53:59.626896 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:00.626871177 +0000 UTC m=+22.454796969 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.951041 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-sn2rz"] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.951550 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-shhlh"] Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.951799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.951851 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-shhlh" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.953969 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954343 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954511 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954788 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954869 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.954901 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.955001 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.965141 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.977246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 07 13:53:59 crc kubenswrapper[4717]: I1007 13:53:59.993017 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:53:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.007786 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.018701 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029702 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75l4r\" (UniqueName: \"kubernetes.io/projected/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-kube-api-access-75l4r\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-bin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-daemon-config\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029842 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-multus-certs\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029860 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-cni-binary-copy\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029877 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-etc-kubernetes\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-system-cni-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.029961 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-multus\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030030 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-hostroot\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-cni-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-k8s-cni-cncf-io\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030091 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-conf-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-cnibin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl82p\" (UniqueName: \"kubernetes.io/projected/bf0d43cd-2fb1-490e-9de4-db923141bd43-kube-api-access-xl82p\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030204 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-hosts-file\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-os-release\") pod 
\"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-socket-dir-parent\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-netns\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.030314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-kubelet\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.038715 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.038770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cf6d588d6c7e24393812fdd2fceee04b10af097244282464d1f2b2d0b4a8a491"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.039524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.040298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7e0b26372667909341264918d01c4c6f4307dcc848490a2495fc899ed5bec7df"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.042490 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.042948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.042989 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e2baf0d95da5f89b67dd77c71e1dd22add271a9d2445856aef1db6607ec7d6bf"} Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.058179 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.071907 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.093712 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.117423 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.127186 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131572 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-system-cni-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-multus\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131758 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-hostroot\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131805 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-conf-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-cni-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-k8s-cni-cncf-io\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl82p\" (UniqueName: \"kubernetes.io/projected/bf0d43cd-2fb1-490e-9de4-db923141bd43-kube-api-access-xl82p\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-cnibin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131912 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-hosts-file\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-os-release\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-socket-dir-parent\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132017 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-netns\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132040 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-kubelet\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75l4r\" (UniqueName: \"kubernetes.io/projected/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-kube-api-access-75l4r\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132122 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-bin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132142 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-daemon-config\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132186 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-multus-certs\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-cni-binary-copy\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " 
pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-etc-kubernetes\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132294 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-etc-kubernetes\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.131700 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-system-cni-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132354 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-multus\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132382 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-hostroot\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-kubelet\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132697 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-conf-dir\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132737 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-hosts-file\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-cnibin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-cni-dir\") pod \"multus-shhlh\" (UID: 
\"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-os-release\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.132942 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-k8s-cni-cncf-io\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133233 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-multus-certs\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133269 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-socket-dir-parent\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133329 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-run-netns\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf0d43cd-2fb1-490e-9de4-db923141bd43-host-var-lib-cni-bin\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-multus-daemon-config\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.133941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf0d43cd-2fb1-490e-9de4-db923141bd43-cni-binary-copy\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.134338 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.145352 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.154150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75l4r\" (UniqueName: \"kubernetes.io/projected/f8d528a9-b0d4-49e6-b782-0e1e3ce36745-kube-api-access-75l4r\") pod \"node-resolver-sn2rz\" (UID: \"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\") " pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.172955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl82p\" (UniqueName: \"kubernetes.io/projected/bf0d43cd-2fb1-490e-9de4-db923141bd43-kube-api-access-xl82p\") pod \"multus-shhlh\" (UID: \"bf0d43cd-2fb1-490e-9de4-db923141bd43\") " pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.180306 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.180684 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.212024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.231379 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.245876 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.257170 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.266459 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sn2rz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.273363 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-shhlh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.277313 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: W1007 13:54:00.299816 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0d43cd_2fb1_490e_9de4_db923141bd43.slice/crio-b052a8e0515ad0131fc6b6aaef1f5d5ce04d01d2dcae6bfc3d002b8ccd2cb8c7 WatchSource:0}: Error finding container b052a8e0515ad0131fc6b6aaef1f5d5ce04d01d2dcae6bfc3d002b8ccd2cb8c7: Status 404 returned error can't find the container with id b052a8e0515ad0131fc6b6aaef1f5d5ce04d01d2dcae6bfc3d002b8ccd2cb8c7 Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.303092 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.334733 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.342098 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-znnjh"] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.342984 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.343801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-2f4zj"] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.344031 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.345985 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lx6tg"] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.346758 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.346885 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.347433 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.347558 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.347621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.347752 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.348085 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.349966 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.351702 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.351871 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.353758 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.360220 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.360223 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.360223 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.360669 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.372812 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.402709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.418943 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.434956 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435662 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435679 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435779 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-proxy-tls\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435862 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-os-release\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfcfx\" (UniqueName: \"kubernetes.io/projected/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-kube-api-access-hfcfx\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435903 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-rootfs\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435944 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.435997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-cnibin\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kff2m\" (UniqueName: \"kubernetes.io/projected/3fa5c083-f07a-43bb-adb4-724602f263b9-kube-api-access-kff2m\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436239 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kq2l\" (UniqueName: \"kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436312 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436361 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436394 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436436 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-system-cni-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436451 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436532 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436556 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436591 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436609 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436625 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.436643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.451982 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.467928 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.482380 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.498987 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.514453 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc 
kubenswrapper[4717]: I1007 13:54:00.532935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.537859 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.538184 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:54:02.538156607 +0000 UTC m=+24.366082409 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538218 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-proxy-tls\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538298 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538355 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-os-release\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538383 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfcfx\" (UniqueName: \"kubernetes.io/projected/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-kube-api-access-hfcfx\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538407 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides\") pod \"ovnkube-node-lx6tg\" 
(UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-rootfs\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538517 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kff2m\" (UniqueName: \"kubernetes.io/projected/3fa5c083-f07a-43bb-adb4-724602f263b9-kube-api-access-kff2m\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-cnibin\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538613 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kq2l\" (UniqueName: \"kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch\") pod 
\"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-system-cni-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538781 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538801 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538818 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538852 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538889 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.538989 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.539556 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: 
object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.539608 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:02.539598435 +0000 UTC m=+24.367524237 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.539646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.539964 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-rootfs\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540039 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.539982 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540215 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540348 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540500 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-binary-copy\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-system-cni-dir\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540636 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540658 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540695 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540732 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540796 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540825 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.540894 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540898 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.540912 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.540925 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.540947 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.540957 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:02.54094721 +0000 UTC m=+24.368873102 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.541122 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.541172 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-os-release\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.541192 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.541243 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:02.541213387 +0000 UTC m=+24.369139229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.541278 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.541322 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3fa5c083-f07a-43bb-adb4-724602f263b9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.541355 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3fa5c083-f07a-43bb-adb4-724602f263b9-cnibin\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.543967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.544088 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-proxy-tls\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.544343 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.553178 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.557742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfcfx\" (UniqueName: \"kubernetes.io/projected/2f0e0c90-54cc-4aac-9c56-ad711d2d69a6-kube-api-access-hfcfx\") pod \"machine-config-daemon-2f4zj\" (UID: \"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\") " pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.560922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kq2l\" (UniqueName: \"kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l\") pod \"ovnkube-node-lx6tg\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.560921 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kff2m\" (UniqueName: \"kubernetes.io/projected/3fa5c083-f07a-43bb-adb4-724602f263b9-kube-api-access-kff2m\") pod \"multus-additional-cni-plugins-znnjh\" (UID: \"3fa5c083-f07a-43bb-adb4-724602f263b9\") " pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.565106 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.586911 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.632935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.639607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.639796 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.639817 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.639831 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.639896 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:02.639873703 +0000 UTC m=+24.467799495 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.657111 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.666018 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-znnjh" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.672184 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.678766 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8279
9488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.683811 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:00 crc kubenswrapper[4717]: W1007 13:54:00.685305 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0e0c90_54cc_4aac_9c56_ad711d2d69a6.slice/crio-792b829628ec2cd7063eca6c38d74b927c08881794694a37bd86e7b36392539f WatchSource:0}: Error finding container 792b829628ec2cd7063eca6c38d74b927c08881794694a37bd86e7b36392539f: Status 404 returned error can't find the container with id 792b829628ec2cd7063eca6c38d74b927c08881794694a37bd86e7b36392539f Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.697201 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.717512 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.746688 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.773963 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.786793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.808891 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.827850 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.841895 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.868228 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.868306 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.868322 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.868445 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.868554 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:00 crc kubenswrapper[4717]: E1007 13:54:00.868664 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.873392 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.874235 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.875403 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.876127 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.877308 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.877942 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.878588 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.879692 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.880570 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.881621 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:00Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.885603 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.886324 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.887266 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.887839 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.888434 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.888980 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.889602 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.891755 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" 
path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.892252 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.892952 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.893949 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.894510 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.895563 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.896075 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.897281 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.897794 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.898414 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.899621 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.900183 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.901282 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.901894 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.902982 4717 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.903153 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.905316 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.905958 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.906871 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.908601 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.910676 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.911981 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.912823 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.914411 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.915096 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.916458 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.917191 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.918441 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.918930 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.920052 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.920628 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.921791 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.924048 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.924617 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.925539 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.926350 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.927109 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 07 13:54:00 crc kubenswrapper[4717]: I1007 13:54:00.929210 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.047106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sn2rz" event={"ID":"f8d528a9-b0d4-49e6-b782-0e1e3ce36745","Type":"ContainerStarted","Data":"db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.047171 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sn2rz" event={"ID":"f8d528a9-b0d4-49e6-b782-0e1e3ce36745","Type":"ContainerStarted","Data":"555892ee7b5a941d258db5b06e103a8291a3b61710b0cfdf4cad3a172bfc20c9"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.048610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerStarted","Data":"84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.048662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" 
event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerStarted","Data":"ab7ba774e0e6cf67599f0d5f1c511c96e97d440241112fa1f53a27b373db3020"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.049996 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerStarted","Data":"069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.050034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerStarted","Data":"b052a8e0515ad0131fc6b6aaef1f5d5ce04d01d2dcae6bfc3d002b8ccd2cb8c7"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.051625 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" exitCode=0 Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.051707 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.051787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"d80b15f81f211d766ee5da739eff9b66f4b4b1a380d50643eae287eb9f93f8e1"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.054343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.054413 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.054429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"792b829628ec2cd7063eca6c38d74b927c08881794694a37bd86e7b36392539f"} Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.065321 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.082874 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.100745 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.116474 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.144035 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.195083 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.245983 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.273326 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.292096 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.314909 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.334543 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.363220 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.416700 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.452242 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.483632 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.521446 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.564731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.605888 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.647944 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.683072 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.726627 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.763280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.802428 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.843332 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.890682 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.924429 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:01 crc kubenswrapper[4717]: I1007 13:54:01.962458 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:01Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.002430 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.065054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.066382 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080" exitCode=0 Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.066475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071601 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.071610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.086646 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.115437 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.132312 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.163835 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.208714 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.241536 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.292195 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.293740 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-x869g"] Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.294191 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.313122 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.332446 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.353136 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.356791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cps6g\" (UniqueName: \"kubernetes.io/projected/484b2799-a3d4-48d4-b7b3-46cd6aac9657-kube-api-access-cps6g\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.358116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/484b2799-a3d4-48d4-b7b3-46cd6aac9657-serviceca\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.358171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/484b2799-a3d4-48d4-b7b3-46cd6aac9657-host\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.372629 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.411149 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.442145 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.459026 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cps6g\" (UniqueName: \"kubernetes.io/projected/484b2799-a3d4-48d4-b7b3-46cd6aac9657-kube-api-access-cps6g\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.459136 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/484b2799-a3d4-48d4-b7b3-46cd6aac9657-serviceca\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.459180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/484b2799-a3d4-48d4-b7b3-46cd6aac9657-host\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.459271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/484b2799-a3d4-48d4-b7b3-46cd6aac9657-host\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.460546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/484b2799-a3d4-48d4-b7b3-46cd6aac9657-serviceca\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") 
" pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.482252 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.508770 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cps6g\" (UniqueName: \"kubernetes.io/projected/484b2799-a3d4-48d4-b7b3-46cd6aac9657-kube-api-access-cps6g\") pod \"node-ca-x869g\" (UID: \"484b2799-a3d4-48d4-b7b3-46cd6aac9657\") " pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.542278 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.559477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559653 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:54:06.559626162 +0000 UTC m=+28.387551954 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.559711 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.559768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.559826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559835 4717 
configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559870 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:06.559861628 +0000 UTC m=+28.387787420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559941 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559955 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559972 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.559988 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.560038 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:06.560026132 +0000 UTC m=+28.387951924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.560059 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:06.560051323 +0000 UTC m=+28.387977115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.582178 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.624659 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.661247 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.661556 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.661747 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.661773 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.661785 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.661857 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:06.66183862 +0000 UTC m=+28.489764412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.701834 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.741549 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.784821 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc 
kubenswrapper[4717]: I1007 13:54:02.807457 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-x869g" Oct 07 13:54:02 crc kubenswrapper[4717]: W1007 13:54:02.824564 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484b2799_a3d4_48d4_b7b3_46cd6aac9657.slice/crio-cb7b8860873a595f7a93f2440457c7d664bf6d4199301a97c1e5f949b2638d30 WatchSource:0}: Error finding container cb7b8860873a595f7a93f2440457c7d664bf6d4199301a97c1e5f949b2638d30: Status 404 returned error can't find the container with id cb7b8860873a595f7a93f2440457c7d664bf6d4199301a97c1e5f949b2638d30 Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.829572 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.868172 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.868287 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.868340 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.868385 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.868533 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:02 crc kubenswrapper[4717]: E1007 13:54:02.868623 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.876192 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c9
8c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.923705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.945441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:02 crc kubenswrapper[4717]: I1007 13:54:02.981112 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:02Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.021280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.062661 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.078216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x869g" event={"ID":"484b2799-a3d4-48d4-b7b3-46cd6aac9657","Type":"ContainerStarted","Data":"4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623"} Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.078286 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-x869g" event={"ID":"484b2799-a3d4-48d4-b7b3-46cd6aac9657","Type":"ContainerStarted","Data":"cb7b8860873a595f7a93f2440457c7d664bf6d4199301a97c1e5f949b2638d30"} Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.080732 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a" exitCode=0 Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.080787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a"} Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.100348 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.143668 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 
13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.185973 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.223959 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.262924 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.300747 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.341413 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.383028 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.422051 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.460662 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.500317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.541939 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.582591 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.625317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.661793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.701732 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.743122 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.780804 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.821356 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:03 crc kubenswrapper[4717]: I1007 13:54:03.868586 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.089375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.091821 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9" exitCode=0 Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.091897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9"} Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.108095 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.120127 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.142452 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.156168 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.171281 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.185799 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.198793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.213282 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.228463 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.262242 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.301140 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.340776 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.380179 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.424063 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.467725 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.867965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:04 crc kubenswrapper[4717]: E1007 13:54:04.868201 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.868676 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.868888 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:04 crc kubenswrapper[4717]: E1007 13:54:04.868990 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:04 crc kubenswrapper[4717]: E1007 13:54:04.869184 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.974476 4717 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.976649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.976694 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.976708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.976852 4717 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.982415 4717 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.982746 4717 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.984101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.984238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.984335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.984436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:04 crc kubenswrapper[4717]: I1007 13:54:04.984522 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:04Z","lastTransitionTime":"2025-10-07T13:54:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:04 crc kubenswrapper[4717]: E1007 13:54:04.996674 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.001965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.002045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.002055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.002075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.002087 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: E1007 13:54:05.017710 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.021693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.021835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.021897 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.021969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.023196 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: E1007 13:54:05.035832 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.040133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.040202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.040215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.040235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.040246 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: E1007 13:54:05.054211 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.062183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.062237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.062278 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.062302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.062316 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: E1007 13:54:05.074485 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: E1007 13:54:05.074673 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.076952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.076993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.077022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.077043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.077054 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.101832 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc" exitCode=0 Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.101892 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.117884 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.139570 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.164477 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.181069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.181123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.181134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.181155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.181166 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.183872 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.204263 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.216737 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.230308 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.241564 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.257152 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-releas
e\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.276470 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.283437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.283490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.283503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.283523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.283534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.296887 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.311036 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.324101 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.333663 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.346126 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.386898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.387000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.387035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.387064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.387079 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.489953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.490031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.490045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.490067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.490082 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.592552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.592612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.592634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.592654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.592665 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.695270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.695308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.695318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.695336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.695347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.798827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.798882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.798900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.798927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.798942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.901853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.901892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.901903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.901923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.901934 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:05Z","lastTransitionTime":"2025-10-07T13:54:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.980303 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 07 13:54:05 crc kubenswrapper[4717]: I1007 13:54:05.999315 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.004866 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.004921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.004932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.004953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.004966 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.013063 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.025942 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.040900 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.054229 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.069148 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.084610 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.101229 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.107077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.107123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.107132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.107151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.107200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.108826 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed" exitCode=0 Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.108913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.117245 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.140749 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.153974 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.165801 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.188607 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.202601 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.210033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.210092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.210105 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.210121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.210138 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.218967 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.232327 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.244698 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.260363 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.273181 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.287681 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.301585 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.313214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.313275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.313288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.313310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.313326 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.316266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"o
vnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.330441 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.345424 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.364769 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z 
is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.384720 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.399914 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.414852 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.416895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.416937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.416949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.416964 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.417022 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.426825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.437545 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.519749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.519794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 
13:54:06.519810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.519832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.519847 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.603581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.603751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.603791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.603832 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.603791933 +0000 UTC m=+36.431717725 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.603927 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.603939 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.603970 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.603988 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.604026 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.603982118 +0000 UTC m=+36.431907900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.604040 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.603945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.604046 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.604038119 +0000 UTC m=+36.431964011 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.604135 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.604125882 +0000 UTC m=+36.432051674 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.622501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.622574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.622588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.622608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.622622 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.705650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.705848 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.705868 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.705881 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.705947 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.705929299 +0000 UTC m=+36.533855091 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.725202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.725259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.725273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.725292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.725303 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.828500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.828559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.828571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.828591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.828612 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.868190 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.868279 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.868395 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.868293 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.868493 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:06 crc kubenswrapper[4717]: E1007 13:54:06.868622 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.931586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.931641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.931651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.931689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:06 crc kubenswrapper[4717]: I1007 13:54:06.931701 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:06Z","lastTransitionTime":"2025-10-07T13:54:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.034642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.034702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.034713 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.034735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.034749 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.116451 4717 generic.go:334] "Generic (PLEG): container finished" podID="3fa5c083-f07a-43bb-adb4-724602f263b9" containerID="785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e" exitCode=0 Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.116499 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerDied","Data":"785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.122538 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.122868 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.138615 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.138806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.139237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.139255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.139281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.139294 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.157643 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.158330 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.181799 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.200527 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.212729 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\
" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.227776 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.241683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.241735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.241749 4717 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.241770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.241783 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.248447 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\
"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"l
og-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.262376 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.276581 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.294227 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.305947 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.318811 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.332647 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344321 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.344827 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.355846 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.368217 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.380859 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.394902 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b
9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.418314 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.431705 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.443165 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.447710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.447759 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.447779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.447800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.447816 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.465058 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.478317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.491519 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.504941 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.517576 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.530703 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.544629 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.550754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.550795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.550806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.550827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.550840 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.558127 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.571814 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:07Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.653468 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.653523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.653534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.653553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.653566 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.756125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.756173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.756184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.756203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.756217 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.859663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.859737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.859750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.859774 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.859790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.962989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.963101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.963116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.963140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:07 crc kubenswrapper[4717]: I1007 13:54:07.963155 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:07Z","lastTransitionTime":"2025-10-07T13:54:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.065674 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.065721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.065729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.065746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.065758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.131089 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" event={"ID":"3fa5c083-f07a-43bb-adb4-724602f263b9","Type":"ContainerStarted","Data":"29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.131228 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.131837 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.147150 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.157341 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.160967 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.168179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.168214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.168225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.168243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.168256 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.179320 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.202744 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.224130 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.255287 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.270226 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.271316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.271361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.271372 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.271392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.271406 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.283280 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.294255 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.306122 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.320142 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.334701 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.347807 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.359614 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.371147 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.373977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.374052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.374070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.374089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.374099 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.383837 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.395775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.407704 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kub
e-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.427296 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.439709 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.451645 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.465593 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.476606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.476649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.476660 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.476681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.476692 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.483898 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.502717 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.531886 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d
0d9c0f4a311e441fc98d4d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.561901 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.579503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.579552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.579564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.579584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.579599 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.589950 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.624282 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.659502 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.682653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.682709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.682720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.682742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.682754 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.701613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.785293 4717 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.785368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.785385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.785404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.785415 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.867634 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.867722 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:08 crc kubenswrapper[4717]: E1007 13:54:08.867819 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.868040 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:08 crc kubenswrapper[4717]: E1007 13:54:08.868094 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:08 crc kubenswrapper[4717]: E1007 13:54:08.868121 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.882491 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.887740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.887787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.887798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.887816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.887828 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.895568 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.912763 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.935588 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d
0d9c0f4a311e441fc98d4d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.946955 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.959918 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.991299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.991357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.991371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.991390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.991402 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:08Z","lastTransitionTime":"2025-10-07T13:54:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:08 crc kubenswrapper[4717]: I1007 13:54:08.992872 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.022697 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.066890 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.094665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.094722 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.094732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.094753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.094766 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.103357 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-
socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.135601 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/0.log" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.138783 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74" exitCode=1 Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.138838 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.139835 4717 scope.go:117] "RemoveContainer" containerID="d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.146695 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.183186 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.197556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.197619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.197632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.197652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.197666 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.223806 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.262446 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.301157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.301210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.301224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.301246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.301260 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.304564 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.352445 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:54:09.091781 5963 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:54:09.091824 5963 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:54:09.091836 5963 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:54:09.091856 5963 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:54:09.091859 5963 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:54:09.091878 5963 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:54:09.091891 5963 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:54:09.091899 5963 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:54:09.091878 5963 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:54:09.091964 5963 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:54:09.091978 5963 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:54:09.091984 5963 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:54:09.091995 5963 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:54:09.092035 5963 factory.go:656] Stopping watch factory\\\\nI1007 13:54:09.092049 5963 ovnkube.go:599] Stopped ovnkube\\\\nI1007 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.384654 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.404245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.404298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.404310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.404328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.404339 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.423187 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.467375 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.502913 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.507033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.507093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.507107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.507134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.507148 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.541798 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.580468 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.609834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.609893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.609904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.609922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.609933 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.628364 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.661984 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.702303 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.712887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.712933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.712948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.712970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.713020 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.740748 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.780796 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.816156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.816328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.816339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.816358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.816368 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.822133 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.864055 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.902116 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.919044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.919105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.919120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.919144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:09 crc kubenswrapper[4717]: I1007 13:54:09.919159 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:09Z","lastTransitionTime":"2025-10-07T13:54:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.021863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.021918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.021929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.021950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.021964 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.124958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.125027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.125040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.125056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.125067 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.143123 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/1.log" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.143659 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/0.log" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.146148 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" exitCode=1 Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.146195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.146263 4717 scope.go:117] "RemoveContainer" containerID="d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.146822 4717 scope.go:117] "RemoveContainer" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" Oct 07 13:54:10 crc kubenswrapper[4717]: E1007 13:54:10.147064 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.161371 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.173893 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.192024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.204520 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.217088 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.228648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.228705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.228717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.228740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.228753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.232739 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.246602 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.257915 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.272112 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.306430 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d87bddb2fa7cf5fb0597a09bbc428e561f9ff80d0d9c0f4a311e441fc98d4d74\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1007 13:54:09.091781 5963 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1007 13:54:09.091824 5963 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1007 13:54:09.091836 5963 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1007 13:54:09.091856 5963 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1007 13:54:09.091859 5963 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1007 13:54:09.091878 5963 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:54:09.091891 5963 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:54:09.091899 5963 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1007 13:54:09.091878 5963 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1007 13:54:09.091964 5963 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:54:09.091978 5963 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1007 13:54:09.091984 5963 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1007 13:54:09.091995 5963 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:54:09.092035 5963 factory.go:656] Stopping watch factory\\\\nI1007 13:54:09.092049 5963 ovnkube.go:599] Stopped ovnkube\\\\nI1007 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster 
options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.331685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.331730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.331746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.331762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.331775 4717 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.347515 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.380634 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.422374 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.433904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.433958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.433971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.433991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.434023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.459082 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.500570 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:10Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.536922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.536968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 
13:54:10.536981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.537020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.537035 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.640534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.640595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.640606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.640628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.640643 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.743263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.743314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.743330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.743352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.743367 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.846968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.847082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.847105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.847134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.847149 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.868291 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:10 crc kubenswrapper[4717]: E1007 13:54:10.868431 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.868499 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:10 crc kubenswrapper[4717]: E1007 13:54:10.868606 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.868307 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:10 crc kubenswrapper[4717]: E1007 13:54:10.868732 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.950111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.950162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.950174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.950195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:10 crc kubenswrapper[4717]: I1007 13:54:10.950211 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:10Z","lastTransitionTime":"2025-10-07T13:54:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.053279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.053326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.053339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.053358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.053370 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.152330 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/1.log" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.155650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.155699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.155712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.155733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.155749 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.156957 4717 scope.go:117] "RemoveContainer" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" Oct 07 13:54:11 crc kubenswrapper[4717]: E1007 13:54:11.157178 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.176088 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de05270
3116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.187980 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.200566 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.214445 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.227451 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.238554 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.248746 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.259429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.259523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.259545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.259568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.259589 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.271818 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.285095 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.298480 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run
/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.311447 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.325059 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.337607 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.352489 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.362785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.362817 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.362828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.362844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.362856 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.367751 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:11Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.464834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.464885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.464901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.464923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.464934 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.567540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.567594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.567612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.567639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.567659 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.670234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.670297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.670317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.670345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.670362 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.774561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.774616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.774627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.774649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.774661 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.877286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.877362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.877381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.877415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.877439 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.980325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.980377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.980392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.980422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:11 crc kubenswrapper[4717]: I1007 13:54:11.980436 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:11Z","lastTransitionTime":"2025-10-07T13:54:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.083334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.083380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.083389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.083407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.083418 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.186815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.186895 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.186907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.186946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.186961 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.289922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.289993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.290031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.290059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.290070 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.394154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.394235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.394246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.394268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.394317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.496961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.497031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.497044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.497064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.497082 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.599653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.599694 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.599703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.599720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.599731 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.702825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.702912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.702942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.702977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.703045 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.805841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.805882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.805893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.805909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.805919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.867520 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.867571 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.867624 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:12 crc kubenswrapper[4717]: E1007 13:54:12.867738 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:12 crc kubenswrapper[4717]: E1007 13:54:12.867818 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:12 crc kubenswrapper[4717]: E1007 13:54:12.868184 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.909198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.909275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.909296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.909324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.909342 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:12Z","lastTransitionTime":"2025-10-07T13:54:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.928396 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v"] Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.928949 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.931329 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.931833 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.945810 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\
\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.958103 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.970414 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.983290 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:12 crc kubenswrapper[4717]: I1007 13:54:12.996184 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:12Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.008867 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.012357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.012392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.012407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.012425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.012435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.021045 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.034173 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.046734 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.061869 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.079075 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.079135 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.079161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.079211 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbrb\" (UniqueName: \"kubernetes.io/projected/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-kube-api-access-wnbrb\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.082407 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de05270
3116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.094573 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.105266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.114867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.114914 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.114924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.114943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.114956 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.122945 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,
\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.135175 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.147887 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:13Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.180744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnbrb\" (UniqueName: \"kubernetes.io/projected/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-kube-api-access-wnbrb\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.180809 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.180833 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.180860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.181481 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.181912 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.188398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.198946 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnbrb\" (UniqueName: \"kubernetes.io/projected/4dbc14f8-cf99-4ab1-ac35-06266f9b154e-kube-api-access-wnbrb\") pod \"ovnkube-control-plane-749d76644c-zj22v\" (UID: \"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.217969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.218027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.218037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.218056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.218066 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.246897 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" Oct 07 13:54:13 crc kubenswrapper[4717]: W1007 13:54:13.263205 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dbc14f8_cf99_4ab1_ac35_06266f9b154e.slice/crio-12b6e716c711d21b5afb4988597dd95ce59b97226eab28b61de69768d7c3b9f0 WatchSource:0}: Error finding container 12b6e716c711d21b5afb4988597dd95ce59b97226eab28b61de69768d7c3b9f0: Status 404 returned error can't find the container with id 12b6e716c711d21b5afb4988597dd95ce59b97226eab28b61de69768d7c3b9f0 Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.322130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.322171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.322180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.322199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.322209 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.425262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.425319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.425332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.425354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.425369 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.527507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.527562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.527594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.527613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.527624 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.630669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.631037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.631052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.631071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.631086 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.733892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.734288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.734376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.734460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.734530 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.836961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.837017 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.837031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.837050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.837093 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.940349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.940668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.940779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.940951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:13 crc kubenswrapper[4717]: I1007 13:54:13.941129 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:13Z","lastTransitionTime":"2025-10-07T13:54:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.028330 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vl8rk"] Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.029044 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.029164 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.043768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.043822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.043833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.043849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.043860 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.045339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.062080 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.079615 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.091995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.110955 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.125184 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.141555 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.146034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.146083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.146100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.146125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.146142 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.157272 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.167109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" event={"ID":"4dbc14f8-cf99-4ab1-ac35-06266f9b154e","Type":"ContainerStarted","Data":"8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.167170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" event={"ID":"4dbc14f8-cf99-4ab1-ac35-06266f9b154e","Type":"ContainerStarted","Data":"519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.167184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" event={"ID":"4dbc14f8-cf99-4ab1-ac35-06266f9b154e","Type":"ContainerStarted","Data":"12b6e716c711d21b5afb4988597dd95ce59b97226eab28b61de69768d7c3b9f0"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.175578 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.192640 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.195310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.195404 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vztg\" (UniqueName: \"kubernetes.io/projected/004bf989-60a1-4a45-bb4d-fc6a41829f3d-kube-api-access-9vztg\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.223435 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.246462 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.248821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.248869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.248886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.248906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.248919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.268998 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.285111 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.297041 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vztg\" (UniqueName: \"kubernetes.io/projected/004bf989-60a1-4a45-bb4d-fc6a41829f3d-kube-api-access-9vztg\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " 
pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.297275 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.297432 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.297505 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:14.797483212 +0000 UTC m=+36.625409004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.306040 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de05270
3116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.316475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vztg\" (UniqueName: \"kubernetes.io/projected/004bf989-60a1-4a45-bb4d-fc6a41829f3d-kube-api-access-9vztg\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.322503 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.340735 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.351829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.351885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.351903 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.351926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.351937 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.356135 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"im
age\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.369388 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.380244 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.391526 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.402054 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.413455 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.426897 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.445387 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.454542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.454604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.454617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.454639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.454652 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.459294 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.471924 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.483692 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.497332 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.507468 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.518736 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.541235 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.554825 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.557415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.557533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.557625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.557726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.557820 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.568299 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:14Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.660666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.660735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.660749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.660768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.660780 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.701610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.701804 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.701849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.701900 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:54:30.701848388 +0000 UTC m=+52.529774220 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.701992 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702028 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.702085 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702109 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:30.702094804 +0000 UTC m=+52.530020596 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702197 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702037 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702260 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:30.702242108 +0000 UTC m=+52.530167900 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702264 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.702332 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:30.70231784 +0000 UTC m=+52.530243722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.763151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.763203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.763215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.763238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.763253 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.803547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.803609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.803800 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.803814 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.803949 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:15.803918663 +0000 UTC m=+37.631844505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.803832 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.803988 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.804077 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:54:30.804054616 +0000 UTC m=+52.631980448 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.866403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.866438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.866448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.866465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.866476 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.867994 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.868121 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.868198 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.868259 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.868845 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:14 crc kubenswrapper[4717]: E1007 13:54:14.868961 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.972935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.973099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.973165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.973189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:14 crc kubenswrapper[4717]: I1007 13:54:14.973203 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:14Z","lastTransitionTime":"2025-10-07T13:54:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.076611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.076675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.076686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.076705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.076717 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.178653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.178738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.178750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.178766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.178778 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.281973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.282085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.282105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.282136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.282158 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.385266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.385342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.385365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.385396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.385420 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.390758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.390800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.390816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.390833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.390846 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.407593 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:15Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.411829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.411873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.411883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.411924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.411942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.427926 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:15Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.431889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.431946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.431961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.431981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.431996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.447534 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:15Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.451729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.451788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.451802 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.451824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.451838 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.466320 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:15Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.470926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.470984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.470996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.471047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.471061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.488549 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:15Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.488685 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.490605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.490635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.490644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.490659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.490671 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.594115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.594169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.594182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.594202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.594215 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.696889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.696952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.696966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.696991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.697045 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.799781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.799823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.799831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.799847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.799858 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.815808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.816155 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.816306 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:17.816273082 +0000 UTC m=+39.644199084 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.868450 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:15 crc kubenswrapper[4717]: E1007 13:54:15.868651 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.902587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.902653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.902670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.902695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:15 crc kubenswrapper[4717]: I1007 13:54:15.902709 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:15Z","lastTransitionTime":"2025-10-07T13:54:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.005720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.005797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.005809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.005833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.005847 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.109299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.109349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.109365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.109388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.109405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.212454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.212516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.212535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.212557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.212570 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.315442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.315480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.315491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.315511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.315524 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.418357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.418402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.418413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.418433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.418448 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.521555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.521596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.521606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.521622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.521632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.624415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.624462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.624471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.624488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.624502 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.727160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.727224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.727236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.727254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.727265 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.830200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.830255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.830267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.830290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.830304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.867752 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.867790 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.867872 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:16 crc kubenswrapper[4717]: E1007 13:54:16.867949 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:16 crc kubenswrapper[4717]: E1007 13:54:16.868124 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:16 crc kubenswrapper[4717]: E1007 13:54:16.868307 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.932663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.932715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.932727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.932747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:16 crc kubenswrapper[4717]: I1007 13:54:16.932760 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:16Z","lastTransitionTime":"2025-10-07T13:54:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.035724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.035776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.035792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.035816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.035832 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.140294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.140347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.140357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.140374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.140384 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.242583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.243287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.243339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.243370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.243399 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.346986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.347100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.347120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.347166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.347193 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.450038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.450096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.450114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.450134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.450147 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.552912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.552953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.552961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.552975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.552987 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.656350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.656404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.656422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.656446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.656465 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.759864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.759931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.759946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.759968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.759982 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.839942 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:17 crc kubenswrapper[4717]: E1007 13:54:17.840231 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:17 crc kubenswrapper[4717]: E1007 13:54:17.840361 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:21.840320775 +0000 UTC m=+43.668246657 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.862732 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.862798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.862810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.862832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.862846 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.868001 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:17 crc kubenswrapper[4717]: E1007 13:54:17.868182 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.965480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.965557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.965575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.965609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:17 crc kubenswrapper[4717]: I1007 13:54:17.965633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:17Z","lastTransitionTime":"2025-10-07T13:54:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.069064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.069159 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.069175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.069199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.069213 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.093418 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.095043 4717 scope.go:117] "RemoveContainer" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" Oct 07 13:54:18 crc kubenswrapper[4717]: E1007 13:54:18.095418 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.172700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.172769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.172783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.172806 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.172832 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.275660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.275715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.275727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.275753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.275765 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.378341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.378425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.378443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.378470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.378488 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.482326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.482388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.482409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.482437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.482457 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.585117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.585183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.585196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.585218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.585236 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.688102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.688145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.688154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.688171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.688185 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.790819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.790874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.790886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.790906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.790922 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.868368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.868368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:18 crc kubenswrapper[4717]: E1007 13:54:18.868524 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.868396 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:18 crc kubenswrapper[4717]: E1007 13:54:18.868599 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:18 crc kubenswrapper[4717]: E1007 13:54:18.868816 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.883405 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.893949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.894269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.894355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.894425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.894498 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.899030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.913625 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.926977 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.939580 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.954297 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.974246 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.993403 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.998692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.998746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.998756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.998777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:18 crc kubenswrapper[4717]: I1007 13:54:18.998787 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:18Z","lastTransitionTime":"2025-10-07T13:54:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.007030 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.020259 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.033759 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.047782 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.062790 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.077926 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.102050 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.102126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.102138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.102153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.102184 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.103126 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.118331 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.134787 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.204627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.204686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.204699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.204718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.204732 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.307617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.307675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.307688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.307707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.307721 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.410252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.410299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.410313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.410337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.410351 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.512988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.513386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.513453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.513575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.513653 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.616415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.616455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.616465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.616485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.616504 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.719510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.719567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.719585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.719608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.719621 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.822438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.822493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.822505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.822521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.822533 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.867747 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:19 crc kubenswrapper[4717]: E1007 13:54:19.867973 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.925763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.926100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.926167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.926281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:19 crc kubenswrapper[4717]: I1007 13:54:19.926352 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:19Z","lastTransitionTime":"2025-10-07T13:54:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.029419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.029488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.029504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.029526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.029540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.132852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.132916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.132932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.132957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.132970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.237615 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.237681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.237700 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.237728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.237748 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.340952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.341167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.341183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.341206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.341222 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.443875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.443926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.443942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.443968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.443985 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.547054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.547107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.547126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.547150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.547164 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.649629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.649728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.649744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.649770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.649822 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.752976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.753076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.753103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.753148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.753174 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.855795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.855847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.855860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.855883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.855896 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.867378 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.867398 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.867554 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:20 crc kubenswrapper[4717]: E1007 13:54:20.867765 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:20 crc kubenswrapper[4717]: E1007 13:54:20.867830 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:20 crc kubenswrapper[4717]: E1007 13:54:20.868059 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.958961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.959060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.959075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.959097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:20 crc kubenswrapper[4717]: I1007 13:54:20.959110 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:20Z","lastTransitionTime":"2025-10-07T13:54:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.062533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.062578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.062587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.062607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.062643 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.165644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.165704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.165718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.165736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.165749 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.269487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.269551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.269566 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.269586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.269603 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.372190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.372244 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.372255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.372273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.372286 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.475103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.475157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.475172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.475194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.475209 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.577956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.578040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.578056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.578077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.578092 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.681538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.681591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.681602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.681624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.681635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.784621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.784675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.784685 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.784703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.784713 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.867762 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:21 crc kubenswrapper[4717]: E1007 13:54:21.867926 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.882292 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:21 crc kubenswrapper[4717]: E1007 13:54:21.882516 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:21 crc kubenswrapper[4717]: E1007 13:54:21.882625 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:29.882599275 +0000 UTC m=+51.710525067 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.886949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.886985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.886999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.887041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.887055 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.990056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.990111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.990122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.990144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:21 crc kubenswrapper[4717]: I1007 13:54:21.990157 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:21Z","lastTransitionTime":"2025-10-07T13:54:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.092910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.092962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.092974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.092992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.093026 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.198291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.198429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.198452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.198481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.198500 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.301325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.301381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.301396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.301417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.301428 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.403892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.403938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.403950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.403966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.403977 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.506809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.506856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.506871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.506902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.506923 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.610215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.610273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.610285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.610301 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.610315 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.713171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.713235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.713251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.713271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.713286 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.816465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.816509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.816520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.816538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.816552 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.868352 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.868411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.868568 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:22 crc kubenswrapper[4717]: E1007 13:54:22.868697 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:22 crc kubenswrapper[4717]: E1007 13:54:22.868806 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:22 crc kubenswrapper[4717]: E1007 13:54:22.868983 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.920061 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.920126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.920138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.920166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:22 crc kubenswrapper[4717]: I1007 13:54:22.920184 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:22Z","lastTransitionTime":"2025-10-07T13:54:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.023646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.023703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.023712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.023730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.023744 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.126811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.126900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.126925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.126958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.126985 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.229958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.230143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.230160 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.230183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.230199 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.332914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.332959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.332974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.332991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.333001 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.436275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.436347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.436363 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.436387 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.436402 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.539148 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.539203 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.539213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.539232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.539244 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.642458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.642508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.642518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.642538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.642572 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.745645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.745717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.745733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.745760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.745782 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.849814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.849884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.849900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.849923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.849942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.868312 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:23 crc kubenswrapper[4717]: E1007 13:54:23.868595 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.953204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.953261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.953276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.953297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:23 crc kubenswrapper[4717]: I1007 13:54:23.953310 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:23Z","lastTransitionTime":"2025-10-07T13:54:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.057149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.057214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.057224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.057245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.057256 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.159683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.159747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.159758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.159779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.159791 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.263451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.263494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.263505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.263523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.263536 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.365518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.365568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.365578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.365596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.365611 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.469149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.469204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.469214 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.469233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.469248 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.571926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.571974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.571983 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.571999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.572028 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.675023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.675071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.675080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.675098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.675107 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.777795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.777850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.777863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.777882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.777897 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.868074 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.868181 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:24 crc kubenswrapper[4717]: E1007 13:54:24.868258 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:24 crc kubenswrapper[4717]: E1007 13:54:24.868354 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.868352 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:24 crc kubenswrapper[4717]: E1007 13:54:24.868456 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.881033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.881093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.881106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.881127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.881141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.984217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.984264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.984275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.984291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:24 crc kubenswrapper[4717]: I1007 13:54:24.984302 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:24Z","lastTransitionTime":"2025-10-07T13:54:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.086690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.086731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.086740 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.086755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.086766 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.190199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.190501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.190594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.190731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.190825 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.293390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.293850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.293942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.294243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.294350 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.397623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.397936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.398112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.398250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.398347 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.500879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.500928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.500941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.500961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.500974 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.604031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.604094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.604104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.604121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.604137 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.706967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.707076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.707090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.707109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.707119 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.714443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.714507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.714525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.714548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.714561 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.729061 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.734052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.734124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.734157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.734189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.734207 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.749021 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.753172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.753223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.753233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.753254 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.753270 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.766979 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.771552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.771603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.771617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.771641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.771654 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.784733 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.789466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.789523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.789537 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.789557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.789571 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.803078 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:25Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.803218 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.809862 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.809914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.809924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.809946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.809958 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.868275 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:25 crc kubenswrapper[4717]: E1007 13:54:25.868498 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.912860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.912924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.912935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.912952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:25 crc kubenswrapper[4717]: I1007 13:54:25.912963 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:25Z","lastTransitionTime":"2025-10-07T13:54:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.016039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.016087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.016098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.016126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.016138 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.118935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.118990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.119002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.119038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.119053 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.222482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.222548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.222562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.222585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.222599 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.325862 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.325957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.325978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.326039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.326060 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.409888 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.418677 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.430182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.430261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.430287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.430317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.430342 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.433408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.447701 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 
13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.463284 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.476840 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.494581 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.512567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.529908 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.533489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.533546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.533568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.533594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.533611 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.545091 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.565596 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.580899 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.593035 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.605393 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.619963 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.635300 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.636212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.636251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.636268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.636294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.636312 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.658454 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.677127 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.691216 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:26Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.739696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.739743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.739756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.739774 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.739786 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.842796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.842857 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.842875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.842896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.842911 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.868236 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.868360 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.868236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:26 crc kubenswrapper[4717]: E1007 13:54:26.868483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:26 crc kubenswrapper[4717]: E1007 13:54:26.868732 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:26 crc kubenswrapper[4717]: E1007 13:54:26.868739 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.946041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.946090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.946101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.946120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:26 crc kubenswrapper[4717]: I1007 13:54:26.946134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:26Z","lastTransitionTime":"2025-10-07T13:54:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.048364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.048771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.048922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.049152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.049304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.151699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.152034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.152129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.152221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.152458 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.255084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.255129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.255140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.255159 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.255170 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.358191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.358236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.358256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.358276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.358288 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.461600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.461662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.461677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.461696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.461707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.565088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.565151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.565164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.565184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.565195 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.667635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.667702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.667716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.667734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.667746 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.770666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.770731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.770750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.770772 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.770787 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.868193 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:27 crc kubenswrapper[4717]: E1007 13:54:27.868463 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.873538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.873588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.873645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.873666 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.873678 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.976155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.976225 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.976238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.976256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:27 crc kubenswrapper[4717]: I1007 13:54:27.976286 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:27Z","lastTransitionTime":"2025-10-07T13:54:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.079486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.079533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.079543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.079560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.079569 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.182456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.182535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.182556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.182586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.182600 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.286948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.287393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.287410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.287435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.287451 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.390834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.390917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.390937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.390967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.390987 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.494290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.494348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.494362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.494384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.494398 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.597158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.597220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.597238 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.597265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.597284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.700313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.700383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.700393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.700412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.700425 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.803631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.803733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.803753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.803776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.803795 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.867936 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.867945 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.868712 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:28 crc kubenswrapper[4717]: E1007 13:54:28.868876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:28 crc kubenswrapper[4717]: E1007 13:54:28.869057 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:28 crc kubenswrapper[4717]: E1007 13:54:28.869165 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.888148 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47
ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.903495 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.906462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.906493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.906505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.906524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.906538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:28Z","lastTransitionTime":"2025-10-07T13:54:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.918702 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.932883 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.953841 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.972970 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:28 crc kubenswrapper[4717]: I1007 13:54:28.983964 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.002716 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.009026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.009065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.009076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.009102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.009131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.017551 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.030828 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.044462 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.058110 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.070436 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126
.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.084680 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.093949 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.108408 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.111106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.111150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.111164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.111184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.111195 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.122860 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.139998 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.213059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.213110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.213124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.213143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.213157 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.316316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.316399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.316417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.316444 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.316471 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.419899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.419967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.419979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.419998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.420031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.523230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.523288 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.523297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.523317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.523329 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.625856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.625917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.625932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.625952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.625972 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.728642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.728704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.728714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.728738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.728749 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.832158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.832227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.832264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.832295 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.832317 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.867736 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:29 crc kubenswrapper[4717]: E1007 13:54:29.867930 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.934446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.934499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.934514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.934530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.934541 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:29Z","lastTransitionTime":"2025-10-07T13:54:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:29 crc kubenswrapper[4717]: I1007 13:54:29.967266 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:29 crc kubenswrapper[4717]: E1007 13:54:29.967489 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:29 crc kubenswrapper[4717]: E1007 13:54:29.967569 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:54:45.967547729 +0000 UTC m=+67.795473521 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.037195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.037256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.037266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.037284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.037294 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.140193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.140256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.140268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.140287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.140301 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.242797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.242863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.242874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.242893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.242906 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.346043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.346105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.346116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.346136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.346150 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.449024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.449085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.449102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.449124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.449138 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.551970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.552062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.552076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.552103 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.552120 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.654478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.654524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.654534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.654568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.654580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.758572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.758642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.758656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.758677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.758690 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.773132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.773291 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.773321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773452 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:55:02.773396698 +0000 UTC m=+84.601322510 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773463 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773560 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:02.773549762 +0000 UTC m=+84.601475634 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.773562 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773703 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773736 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773814 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773844 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773785 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:02.773754737 +0000 UTC m=+84.601680529 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.773962 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:02.773925481 +0000 UTC m=+84.601851453 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.862134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.862218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.862243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.862271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.862286 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.867704 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.867972 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.868061 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.868069 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.868567 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.868698 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.868892 4717 scope.go:117] "RemoveContainer" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.874336 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.874507 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.874533 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.874549 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:30 crc kubenswrapper[4717]: E1007 13:54:30.874604 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:02.874581639 +0000 UTC m=+84.702507441 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.965865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.965921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.965932 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.965952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:30 crc kubenswrapper[4717]: I1007 13:54:30.965963 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:30Z","lastTransitionTime":"2025-10-07T13:54:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.069194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.069252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.069264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.069282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.069293 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.172421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.172487 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.172502 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.172521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.172534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.232484 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/1.log" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.240994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.241471 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.258266 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca0
97c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.276818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.276875 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.276885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.276905 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.276917 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.280582 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.298385 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.311166 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.330973 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.344529 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.364366 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.380058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.380100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.380115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.380136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.380159 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.397179 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.429849 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.451770 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 
13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.474909 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.482281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.482323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.482335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.482357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.482369 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.490283 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.505636 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.520960 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.534495 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.547027 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.561339 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.574890 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:31Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.585569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.585636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.585649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.585672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.585687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.688186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.688237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.688247 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.688267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.688278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.791365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.791821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.791839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.791863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.791882 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.867809 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:31 crc kubenswrapper[4717]: E1007 13:54:31.867974 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.894097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.894141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.894153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.894170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.894181 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.997765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.997841 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.997859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.997888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:31 crc kubenswrapper[4717]: I1007 13:54:31.997903 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:31Z","lastTransitionTime":"2025-10-07T13:54:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.100805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.100874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.100889 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.100907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.100919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.203058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.203100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.203110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.203127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.203183 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.247040 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/2.log" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.247811 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/1.log" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.251509 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" exitCode=1 Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.251567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.251619 4717 scope.go:117] "RemoveContainer" containerID="597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.253193 4717 scope.go:117] "RemoveContainer" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" Oct 07 13:54:32 crc kubenswrapper[4717]: E1007 13:54:32.253537 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.273395 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306313 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://597673f454477b76895429605e3b40d18de052703116170b37661cf5c0ba3ddd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:09Z\\\",\\\"message\\\":\\\"ds:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1007 13:54:09.914704 6128 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-ingress-operator/metrics]} name:Service_openshift-ingress-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.244:9393:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {d8772e82-b0a4-4596-87d3-3d517c13344b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1007 13:54:09.914744 6128 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in 
node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.306801 4717 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.321859 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.338687 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.354110 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.372534 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53
:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.390345 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.403605 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.409576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.409622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.409636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.409656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.409674 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.418090 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.439927 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.457377 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.471837 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.487050 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.500256 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.512152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.512207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.512226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.512247 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.512264 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.517729 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.530892 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.542877 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.557276 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:32Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.615034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.615078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.615089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.615106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.615117 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.718039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.718113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.718129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.718150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.718161 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.821114 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.821166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.821175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.821193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.821203 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.867982 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.868133 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:32 crc kubenswrapper[4717]: E1007 13:54:32.868180 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:32 crc kubenswrapper[4717]: E1007 13:54:32.868309 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.868372 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:32 crc kubenswrapper[4717]: E1007 13:54:32.868624 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.924794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.924863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.924882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.924908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:32 crc kubenswrapper[4717]: I1007 13:54:32.924926 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:32Z","lastTransitionTime":"2025-10-07T13:54:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.027998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.028162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.028192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.028263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.028289 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.132355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.132434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.132449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.132474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.132490 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.235147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.235192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.235202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.235219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.235230 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.258641 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/2.log" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.263163 4717 scope.go:117] "RemoveContainer" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" Oct 07 13:54:33 crc kubenswrapper[4717]: E1007 13:54:33.263341 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.285244 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.301256 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.316293 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.328731 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.338212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.338260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.338273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.338294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.338307 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.339935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.350904 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.363572 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.386142 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.398714 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.410991 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.424546 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.437073 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.443664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.443716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.443725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.443742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 
13:54:33.443753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.456906 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.467694 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.491642 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.514193 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.526985 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.541286 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:33Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.546557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.546600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.546611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.546630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.546645 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.649109 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.649175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.649187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.649205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.649219 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.752331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.752399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.752417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.752440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.752459 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.855721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.855769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.855780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.855801 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.855814 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.868173 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:33 crc kubenswrapper[4717]: E1007 13:54:33.868375 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.958972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.959054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.959070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.959092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:33 crc kubenswrapper[4717]: I1007 13:54:33.959104 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:33Z","lastTransitionTime":"2025-10-07T13:54:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.062128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.062191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.062211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.062236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.062248 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.165658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.165728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.165743 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.165767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.165787 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.268113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.268158 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.268168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.268187 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.268201 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.373094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.373135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.373144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.373163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.373180 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.476237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.476305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.476319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.476339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.476354 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.579406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.579462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.579472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.579494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.579505 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.683153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.683204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.683217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.683236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.683245 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.786125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.786168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.786178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.786198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.786213 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.868272 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.868319 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.868392 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:34 crc kubenswrapper[4717]: E1007 13:54:34.868480 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:34 crc kubenswrapper[4717]: E1007 13:54:34.868600 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:34 crc kubenswrapper[4717]: E1007 13:54:34.868733 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.889322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.889391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.889410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.889432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.889447 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.992154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.992211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.992222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.992240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:34 crc kubenswrapper[4717]: I1007 13:54:34.992254 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:34Z","lastTransitionTime":"2025-10-07T13:54:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.095189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.095250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.095264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.095283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.095294 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.197735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.197783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.197794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.197813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.197825 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.300659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.300713 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.300725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.300746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.300761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.403374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.403425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.403436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.403454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.403466 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.506028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.506073 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.506094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.506118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.506131 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.609877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.610465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.610482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.610501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.610511 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.717784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.717870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.717904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.717935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.717954 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.821223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.821276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.821287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.821306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.821506 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.868314 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:35 crc kubenswrapper[4717]: E1007 13:54:35.868512 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.923744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.923785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.923799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.923823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:35 crc kubenswrapper[4717]: I1007 13:54:35.923838 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:35Z","lastTransitionTime":"2025-10-07T13:54:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.027037 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.027096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.027113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.027135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.027148 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.108906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.108956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.108966 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.108985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.108995 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.126585 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.131899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.131955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.131968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.131990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.132031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.145973 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.150229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.150282 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.150292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.150313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.150328 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.166290 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.170587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.170627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.170676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.170698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.170711 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.184209 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.188587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.188622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.188633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.188651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.188661 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.202876 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:36Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.203193 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.205140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.205198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.205212 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.205239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.205254 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.307686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.307735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.307745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.307766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.307776 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.411843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.411927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.411942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.411969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.411989 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.515215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.515279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.515292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.515316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.515332 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.618465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.618504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.618514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.618530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.618541 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.721735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.721786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.721800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.721821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.721834 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.825586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.825628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.825638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.825654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.825666 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.868053 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.868559 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.868663 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.868853 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.869000 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:36 crc kubenswrapper[4717]: E1007 13:54:36.869149 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.933030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.933084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.933098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.933117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:36 crc kubenswrapper[4717]: I1007 13:54:36.933129 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:36Z","lastTransitionTime":"2025-10-07T13:54:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.035447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.035490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.035498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.035513 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.035522 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.137553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.137602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.137613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.137630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.137641 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.240578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.240622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.240633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.240648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.240683 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.344252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.344286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.344297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.344312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.344324 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.447124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.447193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.447216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.447246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.447266 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.549505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.549549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.549558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.549569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.549578 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.652272 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.652302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.652312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.652324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.652333 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.754785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.754822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.754830 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.754843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.754851 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.857528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.857746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.857771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.857799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.857822 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.867930 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:37 crc kubenswrapper[4717]: E1007 13:54:37.868053 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.959909 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.959948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.959957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.959969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:37 crc kubenswrapper[4717]: I1007 13:54:37.959978 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:37Z","lastTransitionTime":"2025-10-07T13:54:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.062299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.062367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.062389 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.062421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.062440 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.164140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.164186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.164197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.164210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.164219 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.266820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.267201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.267324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.267405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.267482 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.369769 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.370040 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.370117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.370183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.370238 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.473202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.473234 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.473243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.473256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.473266 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.576437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.576527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.576546 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.576574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.576594 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.680470 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.680568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.680581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.680597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.680607 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.783370 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.783403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.783412 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.783426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.783438 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.868143 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.868166 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:38 crc kubenswrapper[4717]: E1007 13:54:38.868337 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.868368 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:38 crc kubenswrapper[4717]: E1007 13:54:38.868671 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:38 crc kubenswrapper[4717]: E1007 13:54:38.868745 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.886115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.886157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.886170 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.886186 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.886198 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.896418 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.912942 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.928126 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.942532 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.954844 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.969567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.988810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.988878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.988894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.988915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.988929 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:38Z","lastTransitionTime":"2025-10-07T13:54:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:38 crc kubenswrapper[4717]: I1007 13:54:38.991146 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"host
IP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:38Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.004809 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfc
fx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.017843 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f0
26dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.029708 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.043124 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.055718 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.067151 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.080776 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.091141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.091179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.091188 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.091204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.091214 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.098220 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.120495 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.147889 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.161881 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:39Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.193652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.193684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.193692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.193706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.193714 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.295843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.295887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.295896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.295911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.295919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.397705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.397749 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.397761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.397777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.397789 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.501162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.501224 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.501251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.501275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.501291 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.603917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.603979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.603989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.604024 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.604039 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.707335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.707608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.707699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.707824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.707906 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.810093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.810152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.810162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.810174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.810182 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.868067 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:39 crc kubenswrapper[4717]: E1007 13:54:39.868194 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.912217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.912532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.912642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.912737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:39 crc kubenswrapper[4717]: I1007 13:54:39.912834 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:39Z","lastTransitionTime":"2025-10-07T13:54:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.015553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.015898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.015992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.016144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.016224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.118849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.118898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.118917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.118935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.118948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.221499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.221781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.221865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.221994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.222109 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.323951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.324025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.324036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.324051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.324060 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.426629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.426751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.426762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.426783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.426795 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.529725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.529814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.529832 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.529853 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.529865 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.632605 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.632660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.632669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.632683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.632693 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.735052 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.735101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.735113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.735130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.735141 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.837923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.837971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.837984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.838000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.838031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.867560 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:40 crc kubenswrapper[4717]: E1007 13:54:40.867905 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.867608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.867560 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:40 crc kubenswrapper[4717]: E1007 13:54:40.867987 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:40 crc kubenswrapper[4717]: E1007 13:54:40.868107 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.941472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.941628 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.941657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.941694 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:40 crc kubenswrapper[4717]: I1007 13:54:40.941725 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:40Z","lastTransitionTime":"2025-10-07T13:54:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.045552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.045608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.045619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.045639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.045654 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.149055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.149126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.149141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.149167 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.149181 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.251644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.251683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.251691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.251703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.251712 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.353452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.353495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.353504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.353518 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.353528 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.456770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.456820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.456831 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.456850 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.456864 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.559837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.559900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.559919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.559945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.559969 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.663457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.663504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.663514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.663530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.663540 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.766359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.766394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.766403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.766417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.766428 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.867291 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:41 crc kubenswrapper[4717]: E1007 13:54:41.867452 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.868984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.869094 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.869107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.869131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.869145 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.971705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.971755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.971766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.971789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:41 crc kubenswrapper[4717]: I1007 13:54:41.971801 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:41Z","lastTransitionTime":"2025-10-07T13:54:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.074885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.074949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.074962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.074981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.074996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.177596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.177647 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.177660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.177677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.177689 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.280435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.280475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.280485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.280503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.280512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.382249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.382312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.382326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.382343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.382358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.484935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.484978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.484989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.485025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.485045 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.586583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.586617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.586626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.586639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.586648 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.688127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.688174 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.688184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.688197 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.688206 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.790312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.790371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.790390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.790416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.790433 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.868072 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.868072 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:42 crc kubenswrapper[4717]: E1007 13:54:42.868239 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:42 crc kubenswrapper[4717]: E1007 13:54:42.868305 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.868096 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:42 crc kubenswrapper[4717]: E1007 13:54:42.868404 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.891972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.891994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.892028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.892041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.892050 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.994699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.994737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.994748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.994764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:42 crc kubenswrapper[4717]: I1007 13:54:42.994775 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:42Z","lastTransitionTime":"2025-10-07T13:54:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.097436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.097477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.097491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.097505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.097518 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.199939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.199973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.199981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.199996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.200031 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.302064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.302140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.302152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.302168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.302199 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.404596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.404630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.404639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.404652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.404660 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.507706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.507766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.507779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.507795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.507829 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.614943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.614979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.614990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.615028 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.615040 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.717421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.717492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.717504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.717522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.717534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.820223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.820266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.820279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.820294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.820305 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.867873 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:43 crc kubenswrapper[4717]: E1007 13:54:43.867996 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.923441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.923503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.923525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.923548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:43 crc kubenswrapper[4717]: I1007 13:54:43.923563 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:43Z","lastTransitionTime":"2025-10-07T13:54:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.026046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.026088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.026100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.026117 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.026127 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.128872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.128926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.128944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.128967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.128985 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.232088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.232137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.232152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.232172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.232187 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.334385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.334422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.334430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.334464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.334474 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.436855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.436898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.436908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.436924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.436939 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.538972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.539043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.539057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.539075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.539085 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.640982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.641056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.641070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.641089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.641099 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.743308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.743351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.743360 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.743376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.743386 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.845973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.846022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.846032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.846045 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.846056 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.867521 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.867612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.867608 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:44 crc kubenswrapper[4717]: E1007 13:54:44.867691 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:44 crc kubenswrapper[4717]: E1007 13:54:44.867752 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:44 crc kubenswrapper[4717]: E1007 13:54:44.867815 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.949260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.949294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.949302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.949314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:44 crc kubenswrapper[4717]: I1007 13:54:44.949325 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:44Z","lastTransitionTime":"2025-10-07T13:54:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.052229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.052284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.052294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.052310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.052318 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.154498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.154538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.154549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.154564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.154573 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.256902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.256941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.256952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.256967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.256977 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.359455 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.359510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.359524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.359542 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.359552 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.461881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.461934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.461946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.461961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.461973 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.563958 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.564023 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.564033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.564046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.564056 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.666771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.666819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.666829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.666861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.666875 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.769141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.769190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.769198 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.769213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.769225 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.868247 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:45 crc kubenswrapper[4717]: E1007 13:54:45.868434 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.873243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.873284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.873298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.873313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.873324 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.976413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.976746 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.976847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.976955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:45 crc kubenswrapper[4717]: I1007 13:54:45.977278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:45Z","lastTransitionTime":"2025-10-07T13:54:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.005507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.005754 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.005898 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:55:18.005872315 +0000 UTC m=+99.833798107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.079523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.080240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.080279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.080297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.080313 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.182656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.182690 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.182704 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.182717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.182727 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.284936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.284986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.285000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.285038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.285051 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.387046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.387101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.387111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.387125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.387136 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.424545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.424596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.424604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.424621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.424632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.439333 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:46Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.442838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.442877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.442886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.442902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.442911 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.455480 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:46Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.458588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.458633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.458663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.458680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.458688 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.472061 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:46Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.475376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.475420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.475434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.475451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.475461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.489035 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:46Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.492592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.492640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.492653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.492672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.492687 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.504578 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:46Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.504687 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.506202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.506240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.506252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.506267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.506278 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.608684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.608725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.608739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.608755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.608767 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.710795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.710826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.710834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.710847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.710855 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.813131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.813163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.813171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.813183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.813192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.868056 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.868080 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.868169 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.868293 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.868419 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:46 crc kubenswrapper[4717]: E1007 13:54:46.868527 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.915843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.915886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.915900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.915918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:46 crc kubenswrapper[4717]: I1007 13:54:46.915932 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:46Z","lastTransitionTime":"2025-10-07T13:54:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.018381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.018423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.018436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.018454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.018498 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.120476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.120509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.120520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.120535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.120545 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.222446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.222486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.222496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.222510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.222519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.324692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.324739 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.324753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.324771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.324783 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.428076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.428119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.428132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.428146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.428156 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.530528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.530571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.530583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.530602 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.530614 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.633624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.633672 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.633687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.633711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.633729 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.736404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.736450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.736460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.736474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.736483 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.839164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.839193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.839201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.839215 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.839223 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.868116 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:47 crc kubenswrapper[4717]: E1007 13:54:47.868335 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.941601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.941658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.941671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.941689 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:47 crc kubenswrapper[4717]: I1007 13:54:47.941701 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:47Z","lastTransitionTime":"2025-10-07T13:54:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.044323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.044359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.044369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.044384 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.044396 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.147025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.147069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.147081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.147098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.147110 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.249516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.249549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.249558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.249573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.249583 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.314410 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/0.log" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.314478 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf0d43cd-2fb1-490e-9de4-db923141bd43" containerID="069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2" exitCode=1 Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.314523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerDied","Data":"069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.315967 4717 scope.go:117] "RemoveContainer" containerID="069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.332113 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.346445 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.351060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.351098 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.351108 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.351123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.351134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.360704 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.370292 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.382127 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.401418 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.412610 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.424727 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.438191 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-api
server-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.451813 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.453253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.453306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.453316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.453331 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.453343 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.467949 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.481935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.492098 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.503522 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.519361 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.542555 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.555422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.555494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.555504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.555533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.555545 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.556799 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.571613 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 
2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.657972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.658038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.658048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.658062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.658071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.760399 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.760434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.760462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.760476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.760484 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.862623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.862931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.863081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.863192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.863292 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.868027 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.868033 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:48 crc kubenswrapper[4717]: E1007 13:54:48.868175 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.868349 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:48 crc kubenswrapper[4717]: E1007 13:54:48.868935 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:48 crc kubenswrapper[4717]: E1007 13:54:48.868858 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.869201 4717 scope.go:117] "RemoveContainer" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" Oct 07 13:54:48 crc kubenswrapper[4717]: E1007 13:54:48.869488 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.883716 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.898045 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.912400 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.933925 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.948591 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.957927 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.964915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.964940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.965332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.965348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.965358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:48Z","lastTransitionTime":"2025-10-07T13:54:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.979460 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:48 crc kubenswrapper[4717]: I1007 13:54:48.992443 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:48Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.003404 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.014243 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.023590 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.034635 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.043307 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.053827 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.065896 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.067205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.067240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.067251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.067268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.067279 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.078482 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.088683 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.099677 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.169573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.169610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.169619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.169632 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.169642 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.271270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.271308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.271319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.271336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.271346 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.319035 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/0.log" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.319095 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerStarted","Data":"c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.335120 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.348775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.361478 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.373709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.373744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.373752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.373788 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.373803 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.374309 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.384374 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.400081 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.412224 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.430212 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.441420 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.453388 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.466624 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.477336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.477401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.477421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.477445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.477461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.486499 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.502864 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.518053 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.531710 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.560927 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.577123 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.580022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.580149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.580165 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.580185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.580198 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.589562 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:49Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.682951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.683019 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.683031 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.683047 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.683059 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.785255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.785323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.785336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.785355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.785367 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.867723 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:49 crc kubenswrapper[4717]: E1007 13:54:49.867896 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.887526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.887570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.887581 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.887598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.887610 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.990415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.990454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.990463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.990479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:49 crc kubenswrapper[4717]: I1007 13:54:49.990488 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:49Z","lastTransitionTime":"2025-10-07T13:54:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.092403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.092435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.092445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.092459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.092470 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.194720 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.194785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.194795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.194814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.194828 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.297265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.297302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.297312 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.297344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.297354 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.400093 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.400123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.400134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.400147 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.400155 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.503116 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.503168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.503180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.503195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.503207 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.605718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.605758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.605770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.605786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.605798 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.707102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.707410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.707490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.707583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.707674 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.810099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.810352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.810414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.810480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.810536 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.868070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.868158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:50 crc kubenswrapper[4717]: E1007 13:54:50.868185 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:50 crc kubenswrapper[4717]: E1007 13:54:50.868281 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.868066 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:50 crc kubenswrapper[4717]: E1007 13:54:50.868390 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.912965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.913220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.913289 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.913358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:50 crc kubenswrapper[4717]: I1007 13:54:50.913429 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:50Z","lastTransitionTime":"2025-10-07T13:54:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.016584 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.016640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.016651 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.016670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.016683 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.119453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.119507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.119522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.119541 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.119554 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.222190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.222236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.222246 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.222260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.222271 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.324805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.324839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.324847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.324868 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.324877 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.426649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.426688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.426697 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.426718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.426728 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.528856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.528890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.528900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.528914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.528925 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.631565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.631609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.631624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.631642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.631655 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.734428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.734475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.734486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.734503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.734513 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.836955 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.836997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.837033 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.837048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.837061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.867494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:51 crc kubenswrapper[4717]: E1007 13:54:51.867884 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.939403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.939456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.939467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.939481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:51 crc kubenswrapper[4717]: I1007 13:54:51.939489 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:51Z","lastTransitionTime":"2025-10-07T13:54:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.041522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.041567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.041576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.041592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.041602 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.144348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.144385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.144395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.144408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.144417 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.246733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.246786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.246799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.246814 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.246824 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.349589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.349894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.349989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.350110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.350220 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.456259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.456290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.456300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.456313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.456322 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.558885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.558938 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.558952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.558969 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.558980 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.661070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.661101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.661110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.661125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.661134 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.763642 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.763684 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.763695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.763713 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.763725 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.866867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.866902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.866914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.866930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.866941 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.867355 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:52 crc kubenswrapper[4717]: E1007 13:54:52.867463 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.867678 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:52 crc kubenswrapper[4717]: E1007 13:54:52.867750 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.867975 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:52 crc kubenswrapper[4717]: E1007 13:54:52.868071 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.969543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.969852 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.969864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.969880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:52 crc kubenswrapper[4717]: I1007 13:54:52.969894 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:52Z","lastTransitionTime":"2025-10-07T13:54:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.072550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.072607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.072616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.072629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.072638 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.174290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.174328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.174339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.174354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.174365 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.276757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.276803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.276816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.276833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.276845 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.378576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.378619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.378629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.378641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.378651 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.480617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.480678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.480691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.480710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.480724 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.582639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.582688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.582698 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.582715 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.582727 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.685083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.685124 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.685135 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.685151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.685161 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.786640 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.786683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.786695 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.786711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.786723 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.868384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:53 crc kubenswrapper[4717]: E1007 13:54:53.868588 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.889415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.889453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.889463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.889477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.889487 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.991868 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.991918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.991930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.991948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:53 crc kubenswrapper[4717]: I1007 13:54:53.991961 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:53Z","lastTransitionTime":"2025-10-07T13:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.094311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.094355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.094367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.094383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.094394 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.196990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.197053 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.197068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.197084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.197098 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.298885 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.298926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.298936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.298952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.298962 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.401408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.401448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.401457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.401472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.401481 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.504419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.504480 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.504491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.504509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.504519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.607088 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.607121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.607130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.607142 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.607151 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.708855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.708903 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.708916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.708933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.708946 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.811723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.811765 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.811775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.811787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.811796 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.870536 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:54 crc kubenswrapper[4717]: E1007 13:54:54.870654 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.870814 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:54 crc kubenswrapper[4717]: E1007 13:54:54.870859 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.870949 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:54 crc kubenswrapper[4717]: E1007 13:54:54.870996 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.913786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.913819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.913827 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.913839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:54 crc kubenswrapper[4717]: I1007 13:54:54.913850 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:54Z","lastTransitionTime":"2025-10-07T13:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.017549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.017586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.017596 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.017612 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.017623 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.120081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.120119 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.120131 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.120146 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.120156 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.221847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.221912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.221921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.221939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.221948 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.324745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.324797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.324808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.324825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.324836 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.427326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.427388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.427398 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.427414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.427425 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.530060 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.530110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.530122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.530139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.530150 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.632961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.633002 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.633018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.633058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.633071 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.735576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.735638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.735649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.735665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.735676 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.838279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.838322 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.838334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.838348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.838358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.867699 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:55 crc kubenswrapper[4717]: E1007 13:54:55.867878 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.941450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.941511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.941532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.941554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:55 crc kubenswrapper[4717]: I1007 13:54:55.941577 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:55Z","lastTransitionTime":"2025-10-07T13:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.044899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.044946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.044956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.044973 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.044985 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.147803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.147849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.147859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.147877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.147889 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.250447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.250486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.250500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.250515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.250526 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.351968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.352026 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.352042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.352058 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.352068 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.455259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.455302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.455311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.455334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.455346 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.557972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.558000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.558041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.558059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.558068 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.650297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.650388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.650405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.650425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.650439 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.664335 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:56Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.668242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.668286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.668298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.668314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.668327 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.679460 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:56Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.683072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.683111 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.683122 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.683137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.683147 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.694313 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:56Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.698101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.698294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.698380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.698462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.698543 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.710648 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:56Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.714347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.714414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.714427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.714445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.714458 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.727208 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:56Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.727333 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.728877 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.728963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.728976 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.728992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.729006 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.830703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.830736 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.830747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.830760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.830770 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.867843 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.867970 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.867863 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.868093 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.867844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:56 crc kubenswrapper[4717]: E1007 13:54:56.868211 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.932833 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.932880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.932890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.932906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:56 crc kubenswrapper[4717]: I1007 13:54:56.932916 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:56Z","lastTransitionTime":"2025-10-07T13:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.034688 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.034724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.034733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.034748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.034758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.137181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.137219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.137227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.137239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.137248 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.239268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.239308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.239321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.239335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.239346 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.342184 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.342243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.342260 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.342285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.342304 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.444898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.444954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.444972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.444997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.445037 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.548112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.548165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.548179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.548205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.548221 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.652371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.652436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.652498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.652516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.652529 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.755429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.755504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.755515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.755530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.755542 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.858247 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.858324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.858337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.858353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.858369 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.867612 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:57 crc kubenswrapper[4717]: E1007 13:54:57.867724 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.881687 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.960583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.960627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.960639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.960656 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:57 crc kubenswrapper[4717]: I1007 13:54:57.960667 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:57Z","lastTransitionTime":"2025-10-07T13:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.063075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.063123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.063136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.063152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.063163 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.166368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.166429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.166440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.166457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.166469 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.268367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.268407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.268417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.268431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.268443 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.371173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.371226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.371235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.371248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.371259 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.473465 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.473539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.473549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.473561 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.473571 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.577453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.577514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.577525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.577547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.577562 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.681826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.682217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.682325 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.682428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.682535 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.785478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.785821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.785921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.786055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.786247 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.868373 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:54:58 crc kubenswrapper[4717]: E1007 13:54:58.868768 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.868516 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:54:58 crc kubenswrapper[4717]: E1007 13:54:58.868967 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.868476 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:54:58 crc kubenswrapper[4717]: E1007 13:54:58.869227 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.889756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.889815 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.889829 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.889848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.889860 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.890373 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.903286 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.918417 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.936524 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be4
0042624b5dcc2d52f89591bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.948911 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.970116 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.986074 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.992217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.992280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.992295 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.992313 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.992329 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:58Z","lastTransitionTime":"2025-10-07T13:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:58 crc kubenswrapper[4717]: I1007 13:54:58.999777 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:58Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.015308 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.027958 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.040008 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.053616 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.068950 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.080608 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.095653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: 
I1007 13:54:59.095711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.095724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.095744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.095761 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.099307 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.113775 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.128471 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.146197 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.159797 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:54:59Z is after 2025-08-24T17:21:41Z" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.199911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.199985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.200005 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.200070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.200092 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.302931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.303032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.303049 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.303074 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.303088 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.406121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.406414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.406495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.406573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.406671 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.509876 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.509943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.509956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.509986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.510005 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.612161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.612204 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.612216 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.612232 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.612243 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.714886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.714933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.714944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.714960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.714969 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.817869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.817913 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.817922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.817935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.817944 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.867555 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:54:59 crc kubenswrapper[4717]: E1007 13:54:59.867690 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.920173 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.920235 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.920249 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.920271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:54:59 crc kubenswrapper[4717]: I1007 13:54:59.920292 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:54:59Z","lastTransitionTime":"2025-10-07T13:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.023202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.023248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.023261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.023277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.023288 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.126900 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.126963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.126974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.126989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.126998 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.229476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.229521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.229532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.229548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.229559 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.331515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.331554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.331562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.331576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.331588 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.434335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.434376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.434388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.434425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.434434 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.537339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.537409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.537420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.537435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.537447 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.640339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.640380 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.640390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.640405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.640416 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.742917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.742954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.742965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.742980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.742991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.845762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.845810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.845823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.845950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.846607 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.867854 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.867906 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:00 crc kubenswrapper[4717]: E1007 13:55:00.867972 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:00 crc kubenswrapper[4717]: E1007 13:55:00.868064 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.868181 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:00 crc kubenswrapper[4717]: E1007 13:55:00.868484 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.949437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.949509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.949531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.949552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:00 crc kubenswrapper[4717]: I1007 13:55:00.949565 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:00Z","lastTransitionTime":"2025-10-07T13:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.052882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.052944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.052956 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.052977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.052991 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.155921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.155974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.155988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.156039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.156061 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.258315 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.258378 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.258395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.258419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.258436 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.360070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.360123 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.360141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.360166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.360182 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.462693 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.462780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.462800 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.462825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.462842 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.564941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.564987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.564996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.565032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.565041 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.667463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.667506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.667516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.667532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.667542 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.770199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.770240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.770252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.770267 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.770277 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.868181 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:01 crc kubenswrapper[4717]: E1007 13:55:01.868338 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.873241 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.873300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.873317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.873349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.873367 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.975087 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.975121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.975133 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.975149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:01 crc kubenswrapper[4717]: I1007 13:55:01.975160 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:01Z","lastTransitionTime":"2025-10-07T13:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.077670 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.077709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.077718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.077731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.077740 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.180619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.180659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.180669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.180683 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.180692 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.282821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.282873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.282887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.282907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.282919 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.385321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.385646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.385659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.385675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.385689 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.488825 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.488881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.488893 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.488914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.488928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.592046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.592095 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.592110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.592132 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.592196 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.695140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.695213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.695227 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.695253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.695268 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.783166 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.783515 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.783462593 +0000 UTC m=+148.611388545 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.783607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.783770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.783836 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.783905 4717 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784079 4717 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784110 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784161 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784177 4717 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784086 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.784045838 +0000 UTC m=+148.611971670 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784252 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.784238713 +0000 UTC m=+148.612164675 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.784279 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.784267304 +0000 UTC m=+148.612193326 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.797466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.797512 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.797524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.797539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.797549 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.868404 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.868454 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.868548 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.868545 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.868665 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.869039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.869352 4717 scope.go:117] "RemoveContainer" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.884431 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.884629 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.884649 4717 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.884660 4717 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:55:02 crc kubenswrapper[4717]: E1007 13:55:02.884708 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-07 13:56:06.88469419 +0000 UTC m=+148.712619982 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.900279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.900319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.900351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.900369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:02 crc kubenswrapper[4717]: I1007 13:55:02.900382 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:02Z","lastTransitionTime":"2025-10-07T13:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.003262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.003299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.003308 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.003323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.003332 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.105782 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.105823 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.105834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.105846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.105854 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.208433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.208491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.208504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.208524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.208538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.311143 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.311177 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.311185 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.311199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.311208 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.361740 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/2.log" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.363932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.364323 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.376941 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.390529 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.402496 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413000 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413569 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413577 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.413601 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.422912 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-
lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.436746 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117
ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.460196 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.477733 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] 
Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"
containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.489406 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 
13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.503922 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.515760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.515808 4717 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.515819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.515836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.515848 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.517640 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.530957 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53
:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.544166 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.555418 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.567241 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.587488 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.600836 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.617942 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.618207 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.618239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.618248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.618261 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.618270 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.636283 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:03Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.720924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.720968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.720982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.720999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.721028 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.823580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.823613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.823624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.823639 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.823651 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.867830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:03 crc kubenswrapper[4717]: E1007 13:55:03.867956 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.926710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.926780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.926796 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.926816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:03 crc kubenswrapper[4717]: I1007 13:55:03.926835 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:03Z","lastTransitionTime":"2025-10-07T13:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.029767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.029808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.029819 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.029835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.029847 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.132515 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.132562 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.132573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.132590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.132600 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.235517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.235580 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.235595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.235610 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.235619 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.338157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.338193 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.338213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.338229 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.338240 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.368081 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/3.log" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.368693 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/2.log" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.371682 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" exitCode=1 Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.371734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.371773 4717 scope.go:117] "RemoveContainer" containerID="3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.372444 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 13:55:04 crc kubenswrapper[4717]: E1007 13:55:04.372708 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.384934 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.397781 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.409085 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.421390 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.433003 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.440935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.440970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.440980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.440994 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.441025 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.443960 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.455001 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.464433 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.476379 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.487781 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.502157 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.519338 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae1e0f7fd7356d01953dabc7c9000a4c1619be40042624b5dcc2d52f89591bb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:31Z\\\",\\\"message\\\":\\\" local for Pod openshift-ovn-kubernetes/ovnkube-node-lx6tg in node crc\\\\nI1007 13:54:31.812296 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812303 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812309 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-x869g\\\\nI1007 13:54:31.812310 6389 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v in node crc\\\\nI1007 13:54:31.812311 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI1007 13:54:31.812313 6389 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/machine-config-daemon-2f4zj\\\\nI1007 13:54:31.812322 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v after 0 failed attempt(s)\\\\nI1007 13:54:31.812325 6389 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI1007 13:54:31.812282 6389 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1007 13:54:31.812336 6389 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI1007 13:54:31.812348 6389 ovn.go:134] Ensur\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:55:03Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI1007 13:55:03.629082 6788 address_set.go:302] 
New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1007 13:55:03.629099 6788 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1007 13:55:03.629134 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1007 13:55:03.629163 6788 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:55:03.629181 6788 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:55:03.629191 6788 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:55:03.629180 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1007 13:55:03.629225 6788 factory.go:656] Stopping watch factory\\\\nI1007 13:55:03.629241 6788 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:55:03.629695 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 13:55:03.629778 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 13:55:03.629810 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:55:03.629838 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:55:03.629910 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.529737 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 
13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.543082 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.543129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.543138 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.543151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.543160 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.548053 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.567085 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.587699 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.605686 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.615358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.624849 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:04Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.645544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.645578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.645587 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.645600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.645610 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.747675 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.747708 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.747716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.747730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.747739 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.849524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.849591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.849607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.849622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.849632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.868055 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.868107 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.868163 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:04 crc kubenswrapper[4717]: E1007 13:55:04.868197 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:04 crc kubenswrapper[4717]: E1007 13:55:04.868292 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:04 crc kubenswrapper[4717]: E1007 13:55:04.868372 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.952243 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.952291 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.952299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.952318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:04 crc kubenswrapper[4717]: I1007 13:55:04.952329 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:04Z","lastTransitionTime":"2025-10-07T13:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.054622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.054659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.054668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.054681 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.054690 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.157431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.157464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.157474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.157490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.157501 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.260493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.260559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.260573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.260600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.260615 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.362805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.362856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.362867 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.362917 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.362930 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.376441 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/3.log" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.379645 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 13:55:05 crc kubenswrapper[4717]: E1007 13:55:05.379794 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.392486 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.405222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.419368 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.438117 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.454056 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.465583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.465629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.465645 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.465665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.465680 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.472244 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.486911 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 
13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.502660 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.517358 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.534623 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.553006 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:55:03Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI1007 13:55:03.629082 6788 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1007 13:55:03.629099 6788 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1007 13:55:03.629134 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1007 13:55:03.629163 6788 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:55:03.629181 6788 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:55:03.629191 6788 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:55:03.629180 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1007 13:55:03.629225 6788 factory.go:656] Stopping watch factory\\\\nI1007 13:55:03.629241 6788 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:55:03.629695 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 13:55:03.629778 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 13:55:03.629810 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:55:03.629838 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:55:03.629910 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.564370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":
[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.568319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.568383 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.568396 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.568423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.568437 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.574082 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.593675 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.607284 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.619204 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.631990 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.646210 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.661084 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:05Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.670938 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.671008 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.671035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.671051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.671060 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.774163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.774208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.774220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.774236 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.774248 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.867758 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:05 crc kubenswrapper[4717]: E1007 13:55:05.868068 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.877530 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.877590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.877611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.877638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.877663 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.981837 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.981880 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.981892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.981922 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:05 crc kubenswrapper[4717]: I1007 13:55:05.981934 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:05Z","lastTransitionTime":"2025-10-07T13:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.085507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.085578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.085601 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.085633 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.085655 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.189183 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.189252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.189263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.189283 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.189295 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.292971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.293055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.293067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.293083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.293093 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.396219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.396271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.396285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.396305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.396321 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.499036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.499084 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.499102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.499118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.499128 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.601804 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.601864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.601879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.601898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.601910 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.704874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.704912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.704920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.704933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.704942 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.807641 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.807721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.807745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.807773 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.807790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.852679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.852729 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.852744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.852768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.852792 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.867829 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.867844 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.867933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.867861 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.868003 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.868074 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.868140 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.871904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.871975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.872000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.872075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.872093 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.886557 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.890373 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.890428 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.890442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.890463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.890480 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.908825 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.913987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.914110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.914126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.914150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.914170 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.928044 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.932986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.933055 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.933071 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.933096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.933115 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.948226 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:06Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:06 crc kubenswrapper[4717]: E1007 13:55:06.948348 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.950252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.950362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.950420 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.950497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:06 crc kubenswrapper[4717]: I1007 13:55:06.950574 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:06Z","lastTransitionTime":"2025-10-07T13:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.053048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.053101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.053113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.053130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.053143 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.156754 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.156821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.156836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.156860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.156878 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.260176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.260237 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.260250 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.260271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.260284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.363821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.363898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.363927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.363962 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.363996 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.466573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.466607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.466616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.466629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.466639 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.570125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.570213 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.570233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.570255 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.570292 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.673153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.673196 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.673205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.673219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.673228 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.775299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.775335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.775347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.775382 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.775395 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.867599 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:07 crc kubenswrapper[4717]: E1007 13:55:07.867913 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.878597 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.878676 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.878691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.878709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.878725 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.981794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.981838 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.981849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.981864 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:07 crc kubenswrapper[4717]: I1007 13:55:07.981877 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:07Z","lastTransitionTime":"2025-10-07T13:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.085381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.085471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.085499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.085526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.085543 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.188374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.188430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.188443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.188464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.188478 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.291459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.291567 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.291583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.291609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.291625 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.396483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.396575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.396593 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.396619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.396637 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.500099 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.500164 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.500188 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.500210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.500247 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.602411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.602469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.602481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.602498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.602512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.704808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.704871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.704883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.704899 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.704909 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.807264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.807307 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.807317 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.807332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.807342 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.868352 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.868419 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.868436 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:08 crc kubenswrapper[4717]: E1007 13:55:08.868541 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:08 crc kubenswrapper[4717]: E1007 13:55:08.868655 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:08 crc kubenswrapper[4717]: E1007 13:55:08.868825 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.884661 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.901963 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.910223 4717 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.910281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.910298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.910323 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.910338 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:08Z","lastTransitionTime":"2025-10-07T13:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.920370 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.935484 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.951024 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.964649 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.976174 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:08 crc kubenswrapper[4717]: I1007 13:55:08.992397 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:08Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014476 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.014965 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.037588 4717 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:55:03Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI1007 13:55:03.629082 6788 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1007 13:55:03.629099 6788 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1007 13:55:03.629134 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1007 13:55:03.629163 6788 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:55:03.629181 6788 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:55:03.629191 6788 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:55:03.629180 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1007 13:55:03.629225 6788 factory.go:656] Stopping watch factory\\\\nI1007 13:55:03.629241 6788 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:55:03.629695 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 13:55:03.629778 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 13:55:03.629810 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:55:03.629838 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:55:03.629910 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.050869 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.062847 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.074570 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.088496 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.106567 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.116553 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.116613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.116626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.116646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.116661 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.119877 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.133935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.151621 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.162410 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:09Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.220100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.220150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.220162 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.220179 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.220190 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.321948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.321999 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.322027 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.322043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.322054 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.424709 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.424753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.424767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.424783 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.424796 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.526884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.526940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.526960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.526982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.526999 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.629491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.629536 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.629547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.629563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.629575 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.732374 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.732433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.732446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.732461 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.732472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.834547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.835270 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.835368 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.835446 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.835486 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.867588 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:09 crc kubenswrapper[4717]: E1007 13:55:09.867782 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.938680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.938731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.938744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.938761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:09 crc kubenswrapper[4717]: I1007 13:55:09.938774 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:09Z","lastTransitionTime":"2025-10-07T13:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.041429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.041482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.041494 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.041514 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.041527 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.144679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.144752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.144775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.144798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.144813 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.248062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.248101 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.248110 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.248127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.248137 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.350489 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.350528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.350545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.350575 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.350588 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.452297 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.452337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.452346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.452361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.452371 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.555549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.555592 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.555603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.555617 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.555627 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.657622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.657657 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.657665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.657677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.657686 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.760752 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.760789 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.760797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.760811 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.760819 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.863353 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.863407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.863422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.863441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.863454 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.867651 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.867681 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.867652 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:10 crc kubenswrapper[4717]: E1007 13:55:10.867795 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:10 crc kubenswrapper[4717]: E1007 13:55:10.867909 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:10 crc kubenswrapper[4717]: E1007 13:55:10.868119 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.967525 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.967590 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.967603 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.967627 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:10 crc kubenswrapper[4717]: I1007 13:55:10.967645 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:10Z","lastTransitionTime":"2025-10-07T13:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.069975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.070057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.070072 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.070089 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.070103 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.172928 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.172981 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.172996 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.173036 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.173055 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.276048 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.276137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.276163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.276199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.276224 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.378863 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.378912 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.378921 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.378936 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.378946 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.481171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.481239 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.481252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.481271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.481283 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.583803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.583888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.583906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.583935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.583954 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.687279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.687341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.687366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.687394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.687413 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.791025 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.791070 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.791078 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.791092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.791101 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.867310 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:11 crc kubenswrapper[4717]: E1007 13:55:11.867474 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.893915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.893984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.893998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.894054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.894072 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.997332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.997401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.997414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.997436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:11 crc kubenswrapper[4717]: I1007 13:55:11.997455 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:11Z","lastTransitionTime":"2025-10-07T13:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.100540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.100591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.100607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.100626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.100641 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.203660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.203702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.203712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.203725 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.203734 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.306376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.306442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.306451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.306463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.306472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.408440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.408495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.408504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.408516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.408527 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.510839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.510898 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.510918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.510941 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.510959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.613206 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.613286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.613298 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.613318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.613330 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.715586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.715626 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.715634 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.715648 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.715657 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.817490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.817563 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.817586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.817616 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.817637 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.867398 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.867459 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.867429 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:12 crc kubenswrapper[4717]: E1007 13:55:12.867619 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:12 crc kubenswrapper[4717]: E1007 13:55:12.867714 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:12 crc kubenswrapper[4717]: E1007 13:55:12.867838 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.919548 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.919582 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.919591 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.919604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:12 crc kubenswrapper[4717]: I1007 13:55:12.919615 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:12Z","lastTransitionTime":"2025-10-07T13:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.023038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.023137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.023180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.023202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.023213 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.125878 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.125934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.125946 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.125960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.125970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.228887 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.228954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.228970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.229035 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.229050 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.331392 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.331445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.331456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.331477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.331496 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.434405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.434471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.434481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.434507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.434521 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.536810 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.536872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.536881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.536896 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.536907 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.639478 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.639521 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.639533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.639550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.639560 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.741844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.741884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.741892 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.741904 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.741913 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.844768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.844813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.844822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.844835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.844845 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.868316 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:13 crc kubenswrapper[4717]: E1007 13:55:13.868479 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.946655 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.946712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.946724 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.946741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:13 crc kubenswrapper[4717]: I1007 13:55:13.946753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:13Z","lastTransitionTime":"2025-10-07T13:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.049557 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.049607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.049619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.049636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.049647 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.152406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.152475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.152485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.152505 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.152521 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.256075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.256141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.256154 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.256172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.256185 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.360091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.360181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.360209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.360248 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.360271 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.463636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.464106 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.464226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.464310 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.464394 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.566873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.566920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.566933 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.566951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.566970 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.669816 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.669870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.669882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.669902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.669915 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.772787 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.772835 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.772846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.772888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.772899 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.867594 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.867748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:14 crc kubenswrapper[4717]: E1007 13:55:14.867929 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.867965 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:14 crc kubenswrapper[4717]: E1007 13:55:14.868110 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:14 crc kubenswrapper[4717]: E1007 13:55:14.868248 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.875180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.875226 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.875245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.875269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.875285 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.977371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.977411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.977421 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.977437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:14 crc kubenswrapper[4717]: I1007 13:55:14.977448 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:14Z","lastTransitionTime":"2025-10-07T13:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.078974 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.079079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.079107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.079137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.079160 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.181888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.181952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.181965 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.181991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.182023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.285469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.285539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.285558 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.285586 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.285605 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.388294 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.388328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.388336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.388350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.388359 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.490339 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.490377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.490403 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.490419 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.490428 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.592391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.592439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.592450 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.592466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.592480 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.695233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.695304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.695318 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.695336 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.695348 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.797950 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.797992 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.798022 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.798039 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.798048 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.867963 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:15 crc kubenswrapper[4717]: E1007 13:55:15.868106 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.900979 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.901032 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.901043 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.901069 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:15 crc kubenswrapper[4717]: I1007 13:55:15.901086 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:15Z","lastTransitionTime":"2025-10-07T13:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.002919 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.002952 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.002959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.002972 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.002981 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.105402 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.105473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.105484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.105501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.105512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.207947 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.207987 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.207997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.208056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.208073 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.309984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.310062 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.310075 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.310097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.310114 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.411638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.411682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.411692 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.411705 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.411715 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.513682 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.513761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.513775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.513798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.513812 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.617059 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.617121 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.617134 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.617156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.617171 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.719939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.720034 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.720051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.720080 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.720095 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.822394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.822448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.822460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.822479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.822490 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.867344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.867456 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:16 crc kubenswrapper[4717]: E1007 13:55:16.867515 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.867789 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:16 crc kubenswrapper[4717]: E1007 13:55:16.868220 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.868267 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 13:55:16 crc kubenswrapper[4717]: E1007 13:55:16.868358 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:16 crc kubenswrapper[4717]: E1007 13:55:16.868408 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.924417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.924454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.924462 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.924476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:16 crc kubenswrapper[4717]: I1007 13:55:16.924485 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:16Z","lastTransitionTime":"2025-10-07T13:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.026395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.026436 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.026449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.026464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.026475 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.128888 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.128927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.128935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.128949 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.128959 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.231390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.231422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.231432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.231453 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.231468 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.300414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.300457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.300469 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.300488 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.300499 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.312081 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:17Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.316302 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.316337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.316346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.316671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.316698 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.328881 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:17Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.332439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.332466 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.332475 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.332490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.332501 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.344108 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:17Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.347608 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.347654 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.347668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.347686 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.347698 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.360562 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:17Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.363410 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.363442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.363456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.363472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.363483 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.373769 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:17Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.373926 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.375092 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.375115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.375126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.375141 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.375154 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.477565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.477618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.477629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.477644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.477656 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.580430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.580481 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.580491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.580504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.580513 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.682882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.682934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.682944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.682959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.682975 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.785528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.785578 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.785588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.785604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.785617 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.867774 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:17 crc kubenswrapper[4717]: E1007 13:55:17.867908 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.887460 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.887509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.887520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.887535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.887547 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.990400 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.990448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.990458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.990474 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:17 crc kubenswrapper[4717]: I1007 13:55:17.990486 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:17Z","lastTransitionTime":"2025-10-07T13:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.046338 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:18 crc kubenswrapper[4717]: E1007 13:55:18.046517 4717 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:55:18 crc kubenswrapper[4717]: E1007 13:55:18.046610 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs podName:004bf989-60a1-4a45-bb4d-fc6a41829f3d nodeName:}" failed. No retries permitted until 2025-10-07 13:56:22.046590665 +0000 UTC m=+163.874516457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs") pod "network-metrics-daemon-vl8rk" (UID: "004bf989-60a1-4a45-bb4d-fc6a41829f3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.093076 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.093125 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.093136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.093155 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.093165 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.195293 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.195346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.195357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.195376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.195387 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.297821 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.297890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.297901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.297918 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.297928 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.400194 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.400252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.400269 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.400292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.400308 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.503369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.503473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.503498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.503526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.503547 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.605535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.605589 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.605604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.605622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.605633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.708168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.708252 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.708276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.708309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.708329 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.810064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.810136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.810156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.810182 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.810200 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.867763 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.867868 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:18 crc kubenswrapper[4717]: E1007 13:55:18.867927 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.867967 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:18 crc kubenswrapper[4717]: E1007 13:55:18.868168 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:18 crc kubenswrapper[4717]: E1007 13:55:18.868304 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.887317 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.899059 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.913268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.913321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.913338 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.913364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.913385 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:18Z","lastTransitionTime":"2025-10-07T13:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.916684 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.929959 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.941115 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.951841 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.964347 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:18 crc kubenswrapper[4717]: I1007 13:55:18.982823 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:18Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.007503 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:55:03Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI1007 13:55:03.629082 6788 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1007 13:55:03.629099 6788 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1007 13:55:03.629134 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1007 13:55:03.629163 6788 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:55:03.629181 6788 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:55:03.629191 6788 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:55:03.629180 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1007 13:55:03.629225 6788 factory.go:656] Stopping watch factory\\\\nI1007 13:55:03.629241 6788 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:55:03.629695 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 13:55:03.629778 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 13:55:03.629810 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:55:03.629838 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:55:03.629910 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.015527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.015574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.015585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.015600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.015612 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.020810 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.032287 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.042089 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.053051 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.064145 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.072768 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.083211 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.106222 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117347 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117924 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117939 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.117949 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.133168 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:19Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.220192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.220275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.220296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.220319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.220340 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.324517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.324583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.324595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.324619 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.324633 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.428333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.428413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.428431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.428457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.428471 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.531594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.531696 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.531717 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.531747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.531765 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.633812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.633926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.633945 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.633970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.633987 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.737068 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.737136 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.737150 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.737172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.737185 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.846335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.846376 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.846388 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.846404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.846416 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.868224 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:19 crc kubenswrapper[4717]: E1007 13:55:19.868644 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.948472 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.948520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.948532 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.948549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:19 crc kubenswrapper[4717]: I1007 13:55:19.948563 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:19Z","lastTransitionTime":"2025-10-07T13:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.051598 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.051650 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.051662 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.051680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.051693 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.154797 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.154842 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.154854 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.154902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.154918 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.257794 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.257844 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.257856 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.257874 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.257886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.360910 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.360967 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.360977 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.360993 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.361023 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.463430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.463497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.463507 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.463523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.463533 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.565843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.565901 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.565916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.565934 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.565943 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.668805 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.668848 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.668858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.668871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.668882 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.771029 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.771077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.771091 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.771115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.771128 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.867952 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.868037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.868144 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:20 crc kubenswrapper[4717]: E1007 13:55:20.868133 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:20 crc kubenswrapper[4717]: E1007 13:55:20.868279 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:20 crc kubenswrapper[4717]: E1007 13:55:20.868414 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.873285 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.873319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.873328 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.873340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.873351 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.975607 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.975644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.975653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.975668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:20 crc kubenswrapper[4717]: I1007 13:55:20.975677 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:20Z","lastTransitionTime":"2025-10-07T13:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.077846 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.077881 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.077890 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.077906 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.077925 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.179457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.179498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.179508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.179523 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.179534 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.282273 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.282309 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.282320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.282354 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.282366 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.385044 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.385171 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.385190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.385208 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.385254 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.487573 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.487618 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.487631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.487649 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.487660 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.590348 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.590391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.590401 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.590415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.590424 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.693306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.693369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.693377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.693390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.693400 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.795284 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.795326 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.795335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.795349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.795358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.868289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:21 crc kubenswrapper[4717]: E1007 13:55:21.868678 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.897498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.897543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.897554 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.897570 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:21 crc kubenswrapper[4717]: I1007 13:55:21.897582 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:21Z","lastTransitionTime":"2025-10-07T13:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.000042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.000102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.000113 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.000129 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.000142 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.101902 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.101951 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.101961 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.101978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.101994 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.204217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.204264 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.204275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.204290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.204302 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.306296 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.306334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.306342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.306355 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.306366 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.408493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.408547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.408559 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.408576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.408586 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.511191 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.511265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.511286 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.511304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.511323 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.613712 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.613753 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.613763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.613779 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.613790 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.716528 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.716565 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.716574 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.716588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.716599 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.819097 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.819137 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.819151 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.819166 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.819176 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.868087 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.868087 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.868218 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:22 crc kubenswrapper[4717]: E1007 13:55:22.868357 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:22 crc kubenswrapper[4717]: E1007 13:55:22.868469 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:22 crc kubenswrapper[4717]: E1007 13:55:22.868569 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.921104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.921139 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.921149 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.921163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:22 crc kubenswrapper[4717]: I1007 13:55:22.921172 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:22Z","lastTransitionTime":"2025-10-07T13:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.023165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.023221 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.023240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.023256 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.023267 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.125281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.125321 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.125330 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.125346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.125358 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.226998 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.227064 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.227079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.227096 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.227106 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.329622 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.329660 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.329668 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.329680 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.329690 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.432464 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.432520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.432529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.432545 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.432557 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.534042 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.534112 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.534126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.534144 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.534156 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.636217 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.636271 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.636287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.636306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.636319 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.738104 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.738153 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.738165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.738181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.738194 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.839940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.839980 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.839989 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.840202 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.840212 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.867559 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:23 crc kubenswrapper[4717]: E1007 13:55:23.867684 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.942550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.942600 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.942609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.942624 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:23 crc kubenswrapper[4717]: I1007 13:55:23.942635 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:23Z","lastTransitionTime":"2025-10-07T13:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.045305 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.045343 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.045351 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.045365 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.045374 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.148714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.148757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.148768 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.148790 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.148809 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.252718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.252798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.252826 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.252860 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.252882 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.356908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.356997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.357049 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.357077 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.357097 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.460157 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.460210 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.460222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.460242 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.460253 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.562663 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.562723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.562733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.562758 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.562770 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.666100 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.666152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.666161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.666178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.666192 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.769948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.770120 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.770161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.770199 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.770225 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.867796 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.867878 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.867809 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:24 crc kubenswrapper[4717]: E1007 13:55:24.868130 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:24 crc kubenswrapper[4717]: E1007 13:55:24.868345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:24 crc kubenswrapper[4717]: E1007 13:55:24.868585 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.872435 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.872498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.872519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.872547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.872566 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.976259 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.976320 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.976335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.976359 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:24 crc kubenswrapper[4717]: I1007 13:55:24.976377 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:24Z","lastTransitionTime":"2025-10-07T13:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.079178 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.079258 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.079277 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.079306 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.079326 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.182425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.182485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.182496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.182519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.182533 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.284667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.284711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.284721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.284737 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.284751 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.388667 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.388781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.388807 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.388843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.388878 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.492223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.492299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.492319 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.492349 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.492369 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.595822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.595882 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.595894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.595915 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.595927 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.698678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.698730 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.698741 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.698757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.698768 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.802701 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.802728 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.802738 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.802750 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.802758 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.868120 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:25 crc kubenswrapper[4717]: E1007 13:55:25.868285 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.905190 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.905219 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.905230 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.905245 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:25 crc kubenswrapper[4717]: I1007 13:55:25.905255 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:25Z","lastTransitionTime":"2025-10-07T13:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.007775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.007871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.007894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.007923 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.007944 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.110556 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.110594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.110606 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.110621 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.110632 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.213560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.213604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.213613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.213629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.213639 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.316404 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.316458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.316477 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.316497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.316512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.419128 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.419168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.419180 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.419195 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.419209 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.521493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.521527 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.521535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.521547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.521556 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.623869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.623929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.623944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.623960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.623972 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.725940 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.725995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.726030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.726057 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.726075 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.828623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.828673 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.828687 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.828703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.828714 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.867501 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.867508 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.867616 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:26 crc kubenswrapper[4717]: E1007 13:55:26.867818 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:26 crc kubenswrapper[4717]: E1007 13:55:26.867889 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:26 crc kubenswrapper[4717]: E1007 13:55:26.868039 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.931442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.931506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.931535 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.931550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:26 crc kubenswrapper[4717]: I1007 13:55:26.931559 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:26Z","lastTransitionTime":"2025-10-07T13:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.033953 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.034041 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.034051 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.034067 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.034077 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.136332 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.136382 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.136394 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.136411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.136422 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.238448 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.238495 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.238506 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.238524 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.238538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.340908 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.340957 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.340971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.340991 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.341027 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.443745 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.443845 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.443872 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.443907 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.443930 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.547677 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.547734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.547744 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.547761 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.547771 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.609391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.609437 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.609447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.609463 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.609475 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.622657 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:27Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.626423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.626476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.626486 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.626501 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.626511 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.648747 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:27Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.652357 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.652485 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.652572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.652646 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.652720 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.667585 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:27Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.671767 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.671799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.671809 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.671822 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.671830 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.686233 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:27Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.690571 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.690613 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.690623 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.690636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.690646 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.703664 4717 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-07T13:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"181bd669-3920-47f2-a947-b62e480db854\\\",\\\"systemUUID\\\":\\\"1df5aeeb-e72c-4f95-84ca-4d7d0e672fce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:27Z is after 
2025-08-24T17:21:41Z" Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.703842 4717 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.705611 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.705636 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.705644 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.705658 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.705667 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.807629 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.807960 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.808090 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.808189 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.808284 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.868100 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:27 crc kubenswrapper[4717]: E1007 13:55:27.868232 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.911065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.911362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.911433 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.911498 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:27 crc kubenswrapper[4717]: I1007 13:55:27.911575 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:27Z","lastTransitionTime":"2025-10-07T13:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.013652 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.013929 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.014046 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.014115 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.014183 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.116550 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.116585 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.116595 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.116609 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.116620 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.219054 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.219276 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.219366 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.219432 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.219516 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.321870 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.322543 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.322635 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.322718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.322789 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.425504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.425549 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.425560 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.425576 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.425586 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.528914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.529081 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.529126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.529165 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.529189 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.632337 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.632381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.632391 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.632408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.632417 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.735440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.735497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.735511 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.735529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.735571 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.839324 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.839427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.839454 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.839497 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.839519 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.867496 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:28 crc kubenswrapper[4717]: E1007 13:55:28.867788 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.867840 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:28 crc kubenswrapper[4717]: E1007 13:55:28.867988 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.868292 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:28 crc kubenswrapper[4717]: E1007 13:55:28.868498 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.889902 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-shhlh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf0d43cd-2fb1-490e-9de4-db923141bd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:54:47Z\\\",\\\"message\\\":\\\"2025-10-07T13:54:02+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19\\\\n2025-10-07T13:54:02+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9af0662c-b9a9-4205-9283-305979c11c19 to /host/opt/cni/bin/\\\\n2025-10-07T13:54:02Z [verbose] multus-daemon started\\\\n2025-10-07T13:54:02Z [verbose] Readiness Indicator file check\\\\n2025-10-07T13:54:47Z [error] have you checked that your 
default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl82p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-shhlh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.905094 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bf5d3c23c0a7e695dc97b5707b0f85b07f1af59ef59935adfe98a58b89da83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hfcfx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-2f4zj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.918232 4717 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.931454 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://181964af96964fc58ad778dca5854002d091f73383c3c36e6657364c576cd7fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.942311 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.942360 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.942369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.942385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.942421 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:28Z","lastTransitionTime":"2025-10-07T13:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.944995 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"004bf989-60a1-4a45-bb4d-fc6a41829f3d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9vztg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vl8rk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.957548 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a9d31f5e-8b0f-4c80-9c2f-8d91b72a3a8a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6eb23ce00314407d6673bf4c94e784c7bb5cf07582e2f51538884da28baa5c75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://44a574de59265457d4dc17ae93e89fc3df0fb40cc82c76921e426c641b9ac414\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.974747 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"884ac09b-b5e7-4d1b-a255-c7a784ec0d7e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c73ee6a1d5d30aa298d34848cf3aea63d058d811c23666e6771d2ff3a2e6a21f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc8b14481ecc47815343fdc693ca6db48797aed4627d24b7e9ae47160e2d942\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5971cca097c803970b23e688dc02c9eead9781d6c27a8f026dc496fc59757b5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40fae667f74ae4794746ea40eba46b3dd5ca695f4d6a9b99512431b429deec47\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcab73636c42c1c438077eea630bc7cd3dec9b58f1e75ac65d3b4fd7c2b6bfc2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-07T13:53:52Z\\\",\\\"message\\\":\\\"W1007 13:53:42.237095 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1007 13:53:42.237668 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759845222 cert, and key in /tmp/serving-cert-153549800/serving-signer.crt, /tmp/serving-cert-153549800/serving-signer.key\\\\nI1007 13:53:42.459731 1 observer_polling.go:159] Starting file observer\\\\nW1007 13:53:42.463356 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1007 13:53:42.463489 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1007 13:53:42.464973 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-153549800/tls.crt::/tmp/serving-cert-153549800/tls.key\\\\\\\"\\\\nF1007 13:53:52.709842 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0a3f11dfd2f86cb9739bac33ea62b9e6494822a7a914703098fb1b0661ef925\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://226ca0fed0990557d1062142968a3acde4adc745e0850020384ac37a8bf7cbf9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:28 crc kubenswrapper[4717]: I1007 13:55:28.988138 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:28Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.007377 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b
5689d648e951f7235d147b30\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-07T13:55:03Z\\\",\\\"message\\\":\\\"712973235162149816) with []\\\\nI1007 13:55:03.629082 6788 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI1007 13:55:03.629099 6788 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI1007 13:55:03.629134 6788 factory.go:1336] Added *v1.Node event handler 7\\\\nI1007 13:55:03.629163 6788 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1007 13:55:03.629181 6788 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1007 13:55:03.629191 6788 handler.go:208] Removed *v1.Node event handler 2\\\\nI1007 13:55:03.629180 6788 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1007 13:55:03.629225 6788 factory.go:656] Stopping watch factory\\\\nI1007 13:55:03.629241 6788 handler.go:208] Removed *v1.Node event handler 7\\\\nI1007 13:55:03.629695 6788 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1007 13:55:03.629778 6788 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1007 13:55:03.629810 6788 ovnkube.go:599] Stopped ovnkube\\\\nI1007 13:55:03.629838 6788 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1007 13:55:03.629910 6788 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-07T13:55:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2kq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lx6tg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.017935 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4dbc14f8-cf99-4ab1-ac35-06266f9b154e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://519e6e23f43c0292b22cf203c61abf978cf467d74dff0097099c7625f3774870\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8deea891a8ff1783e4622b9ebde80194b744ec8ed0f0ae27c745e4a92f247b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wnbrb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zj22v\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.029793 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbda53c22dfd35f93b894b46dd8ae936a6c2aeee9f64c20df16986dd39b6f4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a782e6422c525ee59ca3975eb6621b631b31b8cdbfbfba23aa6240ec5581d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045413 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045464 4717 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045479 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045495 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.045876 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:58Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.062236 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-znnjh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3fa5c083-f07a-43bb-adb4-724602f263b9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://29c8bc2bc5ec13fc8b8310955e15ea93a5c8848ce4d411ec6c9e5b821ecd876b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84ddedf1b963f8e033cf38676ddfda7b781478b174ada86b2c7e69c5f23c1080\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f36c566c1853f9ae2cdf1fc960afbf874803af8ea58bc258c69370ef766da2a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef769bb673917fc7a67c7ea11155be5e12ba8ddc3be89ff29ac13bab38f9a4f9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f1daf982bd3f4f957060599b488720923fe1371546c07030f5998a08abdbfc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://90b9f8f1fb8b5fb31a147a51d9cb212c5165606b4214f8bc42cf29b81f1ef7ed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://785e4c8b8702e6273e471b1be54e322f6fbd37f62ba7d0ec97f5a7f9d1b0235e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:54:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:54:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kff2m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:00Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-znnjh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.083947 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a7db9cf2189e98fbe28269057237383dcc00698203ca73c27ebfbddeeaf8df35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.096622 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-sn2rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8d528a9-b0d4-49e6-b782-0e1e3ce36745\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db34db1153f4554cee30c4de7f5760771dd150184ceb1f1a26d96847219462f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-75l4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-sn2rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.107279 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-x869g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"484b2799-a3d4-48d4-b7b3-46cd6aac9657\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a69bd337a57b5c08ca31d505827b4d7c62b1be637bdddf3eb31d428f5e89623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:54:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cps6g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:54:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-x869g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.125518 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2403a15-d632-4aac-9a0b-f5251cb2f8c8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bfc84b4ab83124f61e2217db3ec38ce3e411f07e063c41ace4c04323800e90f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f95ae88eae9f1ff90b1af0e1726bcb12244b59965d580c42430eef11ddb2df00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d57609516a7613293b73ccc3630f2b4cc9efe5cea1b6fda56fcc1420531e2778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937f1e1a27addca43343a789025bb0ecdd1bb61
266c764079d858c7fc2a72282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://563e8c1aa8b8b4c56c4cc540e50fc2a617154cf643171f0ec5a6311773270ef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3785e247a64dbbeee8e45f27dbc24092ccfc0ab5588e946616b0e0c6d1f9bc8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a914e61d6bfc2d8c98c093f1bde8d38c3189b88aaa961877ec4d0cba090c55ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://37e04c811c457679704e376f24e2c32274d679c45de62887b28f04144f78e81a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.140938 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9355419b-39be-4a62-993f-0671a172419e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1c7fd1f48be27be8ee4ffe3a10e5a321fb13c72dae16e27758054de0229b7c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9120d28f9e9f0944ed344b51b8411b31ab66892cdf80acf139ea6f5c97368b32\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69520b0dc8ef2fe9758fa0b2a367ce9957af275fe311eb9af2ea93103e75c089\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e16a423b447b8abc425e34b538fb3f36f9b7dfe9bf2e442420f5b58b897d0e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.147813 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.147861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.147871 4717 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.147884 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.147892 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.153688 4717 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"002c0c21-dbd5-40aa-8cb0-0bb23e6e9199\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:54:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-07T13:53:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://969b9525f6bb659d899499f543af21a3daf3d0530fb815072062db2f38ca0a43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b57aac95899453e6b4f7706797392ac76755fdb8ee292e1584f112f230f9fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://55a573eb367e8da158c4da328898380297bb572223f7cd7631f11c9669412d48\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-07T13:53:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f092acceb875b39037a860efbd5074235aea295c11c06539ee45c0e8bb9e565\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-07T13:53:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-07T13:53:39Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-07T13:53:38Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-07T13:55:29Z is after 2025-08-24T17:21:41Z" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.250369 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.250405 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.250414 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.250427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.250435 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.352944 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.352984 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.352997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.353038 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.353049 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.455784 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.455828 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.455839 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.455855 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.455866 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.558718 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.558764 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.558776 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.558792 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.558803 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.660879 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.660916 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.660927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.660943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.660956 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.762942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.762985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.763018 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.763030 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.763039 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.865381 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.865422 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.865434 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.865451 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.865463 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.868129 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:29 crc kubenswrapper[4717]: E1007 13:55:29.868374 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.967534 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.967770 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.967781 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.967795 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:29 crc kubenswrapper[4717]: I1007 13:55:29.967803 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:29Z","lastTransitionTime":"2025-10-07T13:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.070156 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.070192 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.070205 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.070223 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.070233 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.173447 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.173496 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.173509 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.173526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.173538 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.275883 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.275920 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.275930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.275943 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.275952 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.377707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.377755 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.377766 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.377786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.377800 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.480211 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.480268 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.480280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.480299 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.480309 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.582594 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.582643 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.582653 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.582669 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.582678 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.685240 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.685290 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.685300 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.685314 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.685322 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.787911 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.787959 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.787968 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.787985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.787997 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.868286 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.868424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.868433 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:30 crc kubenswrapper[4717]: E1007 13:55:30.868531 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:30 crc kubenswrapper[4717]: E1007 13:55:30.868559 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:30 crc kubenswrapper[4717]: E1007 13:55:30.868608 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.869665 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 13:55:30 crc kubenswrapper[4717]: E1007 13:55:30.870138 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lx6tg_openshift-ovn-kubernetes(d24a81c8-811e-41bf-ab5d-48590bc1e8df)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.889699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.889748 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.889762 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.889785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.889797 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.992102 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.992145 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.992201 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.992222 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:30 crc kubenswrapper[4717]: I1007 13:55:30.992235 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:30Z","lastTransitionTime":"2025-10-07T13:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.094508 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.094540 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.094551 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.094568 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.094580 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.197344 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.197375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.197385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.197397 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.197405 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.299931 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.299971 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.299982 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.299997 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.300263 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.402390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.402440 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.402456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.402473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.402754 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.505262 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.505303 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.505316 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.505334 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.505345 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.606935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.606970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.606988 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.607000 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.607035 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.709126 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.709161 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.709169 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.709181 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.709189 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.811630 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.811679 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.811691 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.811707 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.811718 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.868276 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:31 crc kubenswrapper[4717]: E1007 13:55:31.868414 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.913791 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.913843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.913851 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.913865 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:31 crc kubenswrapper[4717]: I1007 13:55:31.913873 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:31Z","lastTransitionTime":"2025-10-07T13:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.015812 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.015847 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.015858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.015873 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.015886 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.117886 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.117926 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.117935 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.117948 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.117958 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.220411 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.220467 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.220476 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.220490 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.220499 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.323555 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.323799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.323894 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.323963 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.324062 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.427361 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.427426 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.427438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.427458 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.427472 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.529861 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.530251 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.530347 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.530445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.530556 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.633168 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.633416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.633499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.633625 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.633707 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.736367 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.736407 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.736423 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.736439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.736453 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.838522 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.839342 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.839377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.839395 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.839410 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.868245 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.868327 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.868255 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:32 crc kubenswrapper[4717]: E1007 13:55:32.868475 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:32 crc kubenswrapper[4717]: E1007 13:55:32.868727 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:32 crc kubenswrapper[4717]: E1007 13:55:32.868933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.941859 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.941927 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.941954 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.941985 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:32 crc kubenswrapper[4717]: I1007 13:55:32.942059 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:32Z","lastTransitionTime":"2025-10-07T13:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.044671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.044716 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.044726 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.044742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.044753 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.146995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.147105 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.147130 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.147163 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.147186 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.249220 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.249471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.249484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.249500 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.249512 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.352533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.352572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.352588 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.352604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.352617 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.455176 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.455209 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.455218 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.455233 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.455252 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.557930 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.557975 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.557986 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.558020 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.558030 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.661406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.661473 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.661484 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.661499 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.661511 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.765604 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.765702 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.765747 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.765777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.765803 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.867392 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:33 crc kubenswrapper[4717]: E1007 13:55:33.867552 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.869172 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.869341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.869386 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.869424 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.869448 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.972459 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.972520 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.972539 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.972564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:33 crc kubenswrapper[4717]: I1007 13:55:33.972583 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:33Z","lastTransitionTime":"2025-10-07T13:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.076760 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.076820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.076840 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.076869 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.076888 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.179735 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.179777 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.179786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.179803 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.179813 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.281995 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.282056 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.282065 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.282079 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.282088 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.383799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.383834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.383843 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.383858 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.383867 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.471817 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/1.log" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.472315 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/0.log" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.472360 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf0d43cd-2fb1-490e-9de4-db923141bd43" containerID="c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0" exitCode=1 Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.472407 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerDied","Data":"c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.472481 4717 scope.go:117] "RemoveContainer" containerID="069fbfec37baf22fc9fbbd1e10ff621d916b0fc5cf87f598ab96c79609a8b2d2" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.473112 4717 scope.go:117] "RemoveContainer" containerID="c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0" Oct 07 13:55:34 crc kubenswrapper[4717]: E1007 13:55:34.473465 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-shhlh_openshift-multus(bf0d43cd-2fb1-490e-9de4-db923141bd43)\"" pod="openshift-multus/multus-shhlh" podUID="bf0d43cd-2fb1-490e-9de4-db923141bd43" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.487665 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.487733 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.487757 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.487785 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.487806 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.497622 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=37.49760434 podStartE2EDuration="37.49760434s" podCreationTimestamp="2025-10-07 13:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.497462186 +0000 UTC m=+116.325387978" watchObservedRunningTime="2025-10-07 13:55:34.49760434 +0000 UTC m=+116.325530162" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.534075 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.534000859 podStartE2EDuration="1m36.534000859s" podCreationTimestamp="2025-10-07 13:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.516531348 +0000 UTC m=+116.344457180" watchObservedRunningTime="2025-10-07 13:55:34.534000859 +0000 UTC m=+116.361926691" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.590085 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.590118 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.590127 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.590140 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.590150 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.616711 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-znnjh" podStartSLOduration=95.616692638 podStartE2EDuration="1m35.616692638s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.616314408 +0000 UTC m=+116.444240220" watchObservedRunningTime="2025-10-07 13:55:34.616692638 +0000 UTC m=+116.444618440" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.650458 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zj22v" podStartSLOduration=94.650436297 podStartE2EDuration="1m34.650436297s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.650152849 +0000 UTC m=+116.478078671" watchObservedRunningTime="2025-10-07 13:55:34.650436297 +0000 UTC m=+116.478362099" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.674548 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=95.674532032 podStartE2EDuration="1m35.674532032s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.673605047 +0000 UTC m=+116.501530849" watchObservedRunningTime="2025-10-07 13:55:34.674532032 +0000 UTC m=+116.502457824" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.688265 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=94.688225913 podStartE2EDuration="1m34.688225913s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.687787491 +0000 UTC m=+116.515713313" watchObservedRunningTime="2025-10-07 13:55:34.688225913 +0000 UTC m=+116.516151705" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.694871 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.694914 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.694925 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.694942 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.694954 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.703060 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=68.703045203 podStartE2EDuration="1m8.703045203s" podCreationTimestamp="2025-10-07 13:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.701887123 +0000 UTC m=+116.529812935" watchObservedRunningTime="2025-10-07 13:55:34.703045203 +0000 UTC m=+116.530970995" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.724852 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sn2rz" podStartSLOduration=95.724831877 podStartE2EDuration="1m35.724831877s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.72418326 +0000 UTC m=+116.552109052" watchObservedRunningTime="2025-10-07 13:55:34.724831877 +0000 UTC m=+116.552757669" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.734723 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-x869g" podStartSLOduration=95.734704127 podStartE2EDuration="1m35.734704127s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.734221655 +0000 UTC m=+116.562147477" watchObservedRunningTime="2025-10-07 13:55:34.734704127 +0000 UTC m=+116.562629919" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.757278 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podStartSLOduration=95.757263912 podStartE2EDuration="1m35.757263912s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:34.756933553 +0000 UTC m=+116.584859345" watchObservedRunningTime="2025-10-07 13:55:34.757263912 +0000 UTC m=+116.585189704" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.797531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.797564 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.797572 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.797583 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.797592 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.868351 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.868457 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:34 crc kubenswrapper[4717]: E1007 13:55:34.868485 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:34 crc kubenswrapper[4717]: E1007 13:55:34.868590 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.868838 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:34 crc kubenswrapper[4717]: E1007 13:55:34.869052 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.899638 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.899671 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.899699 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.899714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:34 crc kubenswrapper[4717]: I1007 13:55:34.899722 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:34Z","lastTransitionTime":"2025-10-07T13:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.002471 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.002517 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.002529 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.002547 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.002558 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.105292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.105358 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.105371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.105408 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.105421 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.207798 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.208107 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.208295 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.208430 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.208578 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.311280 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.311340 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.311352 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.311371 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.311741 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.414445 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.414482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.414516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.414533 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.414543 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.478173 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/1.log" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.517333 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.517409 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.517431 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.517452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.517466 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.619731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.619771 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.619780 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.619799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.619810 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.722731 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.722786 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.722799 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.722818 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.722831 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.825231 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.825257 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.825266 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.825279 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.825288 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.867681 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:35 crc kubenswrapper[4717]: E1007 13:55:35.867807 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.927456 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.927526 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.927538 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.927552 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:35 crc kubenswrapper[4717]: I1007 13:55:35.927561 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:35Z","lastTransitionTime":"2025-10-07T13:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.029710 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.029742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.029751 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.029763 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.029772 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.131664 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.131711 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.131723 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.131742 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.131757 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.234439 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.234492 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.234519 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.234544 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.234562 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.336393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.336429 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.336438 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.336452 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.336461 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.438631 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.438678 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.438706 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.438721 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.438732 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.540375 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.540417 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.540427 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.540443 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.540453 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.642083 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.642152 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.642162 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.642175 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.642215 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.744377 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.744416 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.744425 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.744441 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.744450 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.847200 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.847253 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.847265 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.847281 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.847297 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.867461 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.867461 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.867580 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:36 crc kubenswrapper[4717]: E1007 13:55:36.867690 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:36 crc kubenswrapper[4717]: E1007 13:55:36.867829 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:36 crc kubenswrapper[4717]: E1007 13:55:36.867903 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.949775 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.949824 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.949834 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.949849 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:36 crc kubenswrapper[4717]: I1007 13:55:36.949859 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:36Z","lastTransitionTime":"2025-10-07T13:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.051937 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.051970 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.051978 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.051990 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.052001 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.154292 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.154335 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.154346 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.154362 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.154373 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.256659 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.256703 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.256714 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.256727 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.256738 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.359457 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.359503 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.359516 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.359531 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.359541 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.461345 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.461385 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.461393 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.461406 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.461415 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.564304 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.564341 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.564350 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.564364 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.564373 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.667329 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.667390 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.667415 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.667442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.667460 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.769442 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.769482 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.769491 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.769504 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.769513 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.867340 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:37 crc kubenswrapper[4717]: E1007 13:55:37.867473 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.872228 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.872263 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.872275 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.872287 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.872296 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.974449 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.974483 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.974493 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.974510 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.974521 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.989756 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.989808 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.989820 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.989836 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 07 13:55:37 crc kubenswrapper[4717]: I1007 13:55:37.989848 4717 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-07T13:55:37Z","lastTransitionTime":"2025-10-07T13:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.031121 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt"] Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.031445 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.033985 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.034136 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.034145 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.034860 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.036192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.036271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.036320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.036342 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.036372 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.137592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc 
kubenswrapper[4717]: I1007 13:55:38.137641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.137673 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.137705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.137739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.138018 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.138080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.138670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.142762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.153516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/37b6f88a-0938-48d5-ab0c-3b230e1a34e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2cdt\" (UID: \"37b6f88a-0938-48d5-ab0c-3b230e1a34e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.345832 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.487696 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" event={"ID":"37b6f88a-0938-48d5-ab0c-3b230e1a34e8","Type":"ContainerStarted","Data":"e19abde16c5bae260f80870aec5864f1583faac174b18368a02991a25e453366"} Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.868260 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:38 crc kubenswrapper[4717]: E1007 13:55:38.869200 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.869305 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:38 crc kubenswrapper[4717]: I1007 13:55:38.869388 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:38 crc kubenswrapper[4717]: E1007 13:55:38.869455 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:38 crc kubenswrapper[4717]: E1007 13:55:38.869513 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:38 crc kubenswrapper[4717]: E1007 13:55:38.895409 4717 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 07 13:55:38 crc kubenswrapper[4717]: E1007 13:55:38.972738 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 07 13:55:39 crc kubenswrapper[4717]: I1007 13:55:39.493917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" event={"ID":"37b6f88a-0938-48d5-ab0c-3b230e1a34e8","Type":"ContainerStarted","Data":"c35c25790042d18f1e7a25fe7173b6dad9ab7759bf9308724f396ebde671c991"} Oct 07 13:55:39 crc kubenswrapper[4717]: I1007 13:55:39.513292 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2cdt" podStartSLOduration=100.513257252 podStartE2EDuration="1m40.513257252s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:39.512505003 +0000 UTC m=+121.340430795" watchObservedRunningTime="2025-10-07 13:55:39.513257252 +0000 UTC m=+121.341183084" Oct 07 13:55:39 crc kubenswrapper[4717]: I1007 13:55:39.867976 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:39 crc kubenswrapper[4717]: E1007 13:55:39.868387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:40 crc kubenswrapper[4717]: I1007 13:55:40.867679 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:40 crc kubenswrapper[4717]: I1007 13:55:40.867723 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:40 crc kubenswrapper[4717]: I1007 13:55:40.867749 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:40 crc kubenswrapper[4717]: E1007 13:55:40.868035 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:40 crc kubenswrapper[4717]: E1007 13:55:40.868136 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:40 crc kubenswrapper[4717]: E1007 13:55:40.868369 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:41 crc kubenswrapper[4717]: I1007 13:55:41.868002 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:41 crc kubenswrapper[4717]: E1007 13:55:41.868743 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:42 crc kubenswrapper[4717]: I1007 13:55:42.868394 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:42 crc kubenswrapper[4717]: I1007 13:55:42.868494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:42 crc kubenswrapper[4717]: E1007 13:55:42.868557 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:42 crc kubenswrapper[4717]: E1007 13:55:42.868617 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:42 crc kubenswrapper[4717]: I1007 13:55:42.869056 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:42 crc kubenswrapper[4717]: E1007 13:55:42.869135 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:43 crc kubenswrapper[4717]: I1007 13:55:43.867588 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:43 crc kubenswrapper[4717]: E1007 13:55:43.867727 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:43 crc kubenswrapper[4717]: E1007 13:55:43.974338 4717 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 07 13:55:44 crc kubenswrapper[4717]: I1007 13:55:44.868038 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:44 crc kubenswrapper[4717]: I1007 13:55:44.868140 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:44 crc kubenswrapper[4717]: E1007 13:55:44.868187 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:44 crc kubenswrapper[4717]: I1007 13:55:44.867996 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:44 crc kubenswrapper[4717]: E1007 13:55:44.868313 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:44 crc kubenswrapper[4717]: E1007 13:55:44.868366 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:45 crc kubenswrapper[4717]: I1007 13:55:45.868033 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:45 crc kubenswrapper[4717]: E1007 13:55:45.868664 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:45 crc kubenswrapper[4717]: I1007 13:55:45.868968 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.518272 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/3.log" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.521297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerStarted","Data":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.521748 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.684965 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podStartSLOduration=107.684929505 podStartE2EDuration="1m47.684929505s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:46.560564458 +0000 UTC m=+128.388490250" watchObservedRunningTime="2025-10-07 13:55:46.684929505 +0000 UTC m=+128.512855297" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.685509 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vl8rk"] Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.685868 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:46 crc kubenswrapper[4717]: E1007 13:55:46.685994 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.868251 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:46 crc kubenswrapper[4717]: E1007 13:55:46.868611 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.868318 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:46 crc kubenswrapper[4717]: E1007 13:55:46.868673 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.868244 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:46 crc kubenswrapper[4717]: E1007 13:55:46.868724 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:46 crc kubenswrapper[4717]: I1007 13:55:46.868748 4717 scope.go:117] "RemoveContainer" containerID="c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0" Oct 07 13:55:47 crc kubenswrapper[4717]: I1007 13:55:47.526475 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/1.log" Oct 07 13:55:47 crc kubenswrapper[4717]: I1007 13:55:47.526592 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerStarted","Data":"9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991"} Oct 07 13:55:47 crc kubenswrapper[4717]: I1007 13:55:47.547654 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-shhlh" podStartSLOduration=108.547619257 podStartE2EDuration="1m48.547619257s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:55:47.546342144 +0000 UTC m=+129.374267946" watchObservedRunningTime="2025-10-07 13:55:47.547619257 +0000 UTC m=+129.375545089" Oct 07 13:55:48 crc kubenswrapper[4717]: I1007 13:55:48.867899 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:48 crc kubenswrapper[4717]: I1007 13:55:48.870694 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:48 crc kubenswrapper[4717]: E1007 13:55:48.870684 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 07 13:55:48 crc kubenswrapper[4717]: I1007 13:55:48.870746 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:48 crc kubenswrapper[4717]: I1007 13:55:48.870768 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:48 crc kubenswrapper[4717]: E1007 13:55:48.870810 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 07 13:55:48 crc kubenswrapper[4717]: E1007 13:55:48.870879 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vl8rk" podUID="004bf989-60a1-4a45-bb4d-fc6a41829f3d" Oct 07 13:55:48 crc kubenswrapper[4717]: E1007 13:55:48.870999 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.867495 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.867531 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.867696 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.867759 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.869979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.870068 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.870092 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.870092 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.870769 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 07 13:55:50 crc kubenswrapper[4717]: I1007 13:55:50.871582 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.141734 4717 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.177449 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.178437 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.178520 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4w2xj"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.178933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.180591 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.181198 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.182211 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.182660 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.183721 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.184373 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.186357 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.186728 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.199084 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.199147 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.202177 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.202501 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.202851 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.203194 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.203529 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.204150 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.204245 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.204727 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216268 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216701 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216783 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216910 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216929 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216972 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217057 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217145 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217295 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217332 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217433 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217566 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217608 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217696 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217823 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217841 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217152 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.217995 4717 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218072 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.216907 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218132 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218223 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b55vq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218475 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218699 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.218800 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.220323 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.220412 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.220527 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cr6vm"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.220711 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.222451 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.222511 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225469 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225771 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225524 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.226078 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-q2jhk"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225665 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225684 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225705 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225761 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.225858 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.226743 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.227201 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.228408 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.228575 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229026 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbrgn"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229511 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h8764"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229575 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229679 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229722 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.229912 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.230140 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.230488 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.230788 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.231023 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.231475 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-79gpc"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.231817 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.232104 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-497gd"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.232589 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.234111 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.234654 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.234955 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.235101 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.235751 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236067 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236342 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236394 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236418 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236725 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.236982 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.237236 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.248159 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.252129 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.252377 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.254856 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.254898 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255032 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255145 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255223 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255448 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"encryption-config-1" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.255722 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.257691 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.258456 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.258640 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.259471 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.260627 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279928 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-client\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.279987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d20fff-a312-47d5-a211-5bd15a65a26c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-serving-cert\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280053 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a33ba0-7f01-465d-b359-4904d6b7a769-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280073 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pl49\" (UniqueName: \"kubernetes.io/projected/01a33ba0-7f01-465d-b359-4904d6b7a769-kube-api-access-4pl49\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/678c62ea-ce55-4cbd-b400-c5316419c3a9-serving-cert\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7bq\" (UniqueName: \"kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280125 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-config\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280139 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d6d20fff-a312-47d5-a211-5bd15a65a26c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-encryption-config\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hznzb\" (UniqueName: \"kubernetes.io/projected/678c62ea-ce55-4cbd-b400-c5316419c3a9-kube-api-access-hznzb\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280193 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280209 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-trusted-ca\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280222 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6m4t\" (UniqueName: \"kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280237 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-dir\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280311 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca\") pod 
\"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp4hh\" (UniqueName: \"kubernetes.io/projected/d6d20fff-a312-47d5-a211-5bd15a65a26c-kube-api-access-bp4hh\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280377 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280393 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-policies\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280504 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkq8\" (UniqueName: \"kubernetes.io/projected/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-kube-api-access-wbkq8\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280540 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280553 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdm7t\" (UniqueName: \"kubernetes.io/projected/78d9c154-e844-48ff-80e3-89d1d17f1343-kube-api-access-kdm7t\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") 
" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.280580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a33ba0-7f01-465d-b359-4904d6b7a769-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.282641 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.282978 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.283736 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.283916 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.284143 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.284887 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.285354 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.290182 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.290809 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.291399 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.291697 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.291870 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.292774 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.293498 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.295868 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296173 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296295 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296411 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296519 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296559 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296654 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296726 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296864 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.296899 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297016 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297079 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297112 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297315 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297345 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297424 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297441 4717 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297514 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297637 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297872 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.297950 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tfzld"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298602 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298774 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298681 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298728 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298747 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298770 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298788 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298815 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.298887 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.299786 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.304521 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.305051 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2v924"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.305525 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.305897 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.306363 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.306825 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.307037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.307234 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.307480 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.307837 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.310187 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.313332 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.314805 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.328096 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.328341 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.328636 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.330902 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.331369 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.335692 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.336214 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.337149 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2r2zr"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.337922 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.339359 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98nvq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.340460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.342047 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.342642 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.345155 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.364343 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.366212 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.367538 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.368186 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.368277 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.368371 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.368632 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.371307 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.371365 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.372546 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gpv6z"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.373989 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cr6vm"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.374141 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gpv6z" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.375041 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.376987 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-497gd"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.378014 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b55vq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.379255 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4w2xj"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.380097 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.380100 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-q2jhk"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381098 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbrgn"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381506 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d20fff-a312-47d5-a211-5bd15a65a26c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381548 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f20492a2-514b-43c3-aa9b-f26b35e14a8a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: \"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381569 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-serving-cert\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381596 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381628 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frb9c\" (UniqueName: \"kubernetes.io/projected/f20492a2-514b-43c3-aa9b-f26b35e14a8a-kube-api-access-frb9c\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: 
\"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381656 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a33ba0-7f01-465d-b359-4904d6b7a769-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pl49\" (UniqueName: \"kubernetes.io/projected/01a33ba0-7f01-465d-b359-4904d6b7a769-kube-api-access-4pl49\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381686 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381704 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rthxr\" (UniqueName: \"kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d6de19-34a7-495c-be2e-71b777330cee-service-ca-bundle\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/678c62ea-ce55-4cbd-b400-c5316419c3a9-serving-cert\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7bq\" (UniqueName: \"kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-config\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10f378f0-4c92-4b53-9454-f9d83f3cb058-machine-approver-tls\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381814 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-config\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d20fff-a312-47d5-a211-5bd15a65a26c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381844 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-encryption-config\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381859 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-webhook-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381890 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-encryption-config\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc 
kubenswrapper[4717]: I1007 13:55:58.381903 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6ce89a1-e415-49d3-956d-5a63739c4afb-tmpfs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hznzb\" (UniqueName: \"kubernetes.io/projected/678c62ea-ce55-4cbd-b400-c5316419c3a9-kube-api-access-hznzb\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79cce85a-0be0-41da-87c2-66184cff4b22-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-config\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.381997 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382030 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-trusted-ca\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 
13:55:58.382216 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6m4t\" (UniqueName: \"kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-dir\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382256 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-serving-cert\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382273 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382289 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-auth-proxy-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382304 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7hg4\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-kube-api-access-n7hg4\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp4hh\" (UniqueName: \"kubernetes.io/projected/d6d20fff-a312-47d5-a211-5bd15a65a26c-kube-api-access-bp4hh\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382426 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382443 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382460 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-226gj\" (UniqueName: \"kubernetes.io/projected/84d6de19-34a7-495c-be2e-71b777330cee-kube-api-access-226gj\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382476 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-policies\") 
pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxskh\" (UniqueName: \"kubernetes.io/projected/71387a0e-c68c-4cda-bc81-9c78fab38a52-kube-api-access-mxskh\") pod \"migrator-59844c95c7-njglk\" (UID: \"71387a0e-c68c-4cda-bc81-9c78fab38a52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79cce85a-0be0-41da-87c2-66184cff4b22-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382542 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tvv4\" (UniqueName: \"kubernetes.io/projected/df090bea-623b-4671-b682-e0880460ef04-kube-api-access-4tvv4\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-key\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-metrics-certs\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382594 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-client\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8x6m\" (UniqueName: \"kubernetes.io/projected/df8e1457-a676-4aeb-ad71-6162115e8b58-kube-api-access-t8x6m\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382630 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3436d60-53c5-4994-88f1-d3aa555e96bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: 
\"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382643 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-serving-cert\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382658 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvtj\" (UniqueName: \"kubernetes.io/projected/92e81bbc-e8d4-414d-8e09-d8bff498f83f-kube-api-access-fgvtj\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382675 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjcmv\" (UniqueName: \"kubernetes.io/projected/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-kube-api-access-gjcmv\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkq8\" (UniqueName: \"kubernetes.io/projected/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-kube-api-access-wbkq8\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382714 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jfr\" (UniqueName: \"kubernetes.io/projected/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-kube-api-access-b7jfr\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3fdb570-c6b1-43a3-9726-b3870e21c38d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e68626-bdd9-4340-8d96-64d03109427a-config\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-client\") pod 
\"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382776 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3fdb570-c6b1-43a3-9726-b3870e21c38d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks62g\" (UniqueName: \"kubernetes.io/projected/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-kube-api-access-ks62g\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df090bea-623b-4671-b682-e0880460ef04-serving-cert\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdm7t\" (UniqueName: \"kubernetes.io/projected/78d9c154-e844-48ff-80e3-89d1d17f1343-kube-api-access-kdm7t\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-node-pullsecrets\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " 
pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382914 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-image-import-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79cce85a-0be0-41da-87c2-66184cff4b22-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383093 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-service-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383112 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e68626-bdd9-4340-8d96-64d03109427a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a33ba0-7f01-465d-b359-4904d6b7a769-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383148 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-serving-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc 
kubenswrapper[4717]: I1007 13:55:58.383186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0e68626-bdd9-4340-8d96-64d03109427a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t252v\" (UniqueName: \"kubernetes.io/projected/10f378f0-4c92-4b53-9454-f9d83f3cb058-kube-api-access-t252v\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-cabundle\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383252 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-default-certificate\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383267 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383281 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-config\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbkxs\" (UniqueName: \"kubernetes.io/projected/b6ce89a1-e415-49d3-956d-5a63739c4afb-kube-api-access-lbkxs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383312 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55c7q\" (UniqueName: \"kubernetes.io/projected/c3436d60-53c5-4994-88f1-d3aa555e96bd-kube-api-access-55c7q\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383327 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df090bea-623b-4671-b682-e0880460ef04-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df8e1457-a676-4aeb-ad71-6162115e8b58-metrics-tls\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383398 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-serving-cert\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383472 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-config\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.382978 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383521 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.383555 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.384135 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01a33ba0-7f01-465d-b359-4904d6b7a769-config\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.384188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h8764"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.384366 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385050 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385074 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6m9q\" (UniqueName: \"kubernetes.io/projected/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-kube-api-access-x6m9q\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385234 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385859 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit-dir\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385947 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-images\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.385952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-dir\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386002 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-audit-policies\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386151 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-config\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6d20fff-a312-47d5-a211-5bd15a65a26c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: 
\"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386900 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-client\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.386936 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387146 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387222 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-79gpc"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387328 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-stats-auth\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387728 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/678c62ea-ce55-4cbd-b400-c5316419c3a9-trusted-ca\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.387856 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " 
pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.388364 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.388511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.388668 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-serving-cert\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.388898 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01a33ba0-7f01-465d-b359-4904d6b7a769-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.389434 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.389603 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/678c62ea-ce55-4cbd-b400-c5316419c3a9-serving-cert\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.390515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.391522 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.394088 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.394126 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.394841 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.394955 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-285fj"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.395346 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.395862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-encryption-config\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.396035 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jjtwt"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.396562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.396836 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.396896 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.397133 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98nvq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.398105 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tfzld"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.399435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gpv6z"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.399954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/78d9c154-e844-48ff-80e3-89d1d17f1343-etcd-client\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.400713 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.403436 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.408080 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.408140 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.409297 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.409742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.410094 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.411217 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.411762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d6d20fff-a312-47d5-a211-5bd15a65a26c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.412484 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.414511 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2r2zr"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.415580 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.416644 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jjtwt"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.419114 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.426140 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b8ftz"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.427460 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8ftz"] Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.427571 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.427575 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.440597 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.461437 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.481622 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f20492a2-514b-43c3-aa9b-f26b35e14a8a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: \"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frb9c\" (UniqueName: \"kubernetes.io/projected/f20492a2-514b-43c3-aa9b-f26b35e14a8a-kube-api-access-frb9c\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: \"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rthxr\" (UniqueName: \"kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488365 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 
13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488392 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-config\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488416 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d6de19-34a7-495c-be2e-71b777330cee-service-ca-bundle\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-encryption-config\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488471 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10f378f0-4c92-4b53-9454-f9d83f3cb058-machine-approver-tls\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488518 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-webhook-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488540 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6ce89a1-e415-49d3-956d-5a63739c4afb-tmpfs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79cce85a-0be0-41da-87c2-66184cff4b22-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-config\") pod 
\"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488639 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488665 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-serving-cert\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-auth-proxy-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488801 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488826 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7hg4\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-kube-api-access-n7hg4\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488853 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488896 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488925 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxskh\" (UniqueName: \"kubernetes.io/projected/71387a0e-c68c-4cda-bc81-9c78fab38a52-kube-api-access-mxskh\") pod \"migrator-59844c95c7-njglk\" (UID: \"71387a0e-c68c-4cda-bc81-9c78fab38a52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.488977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79cce85a-0be0-41da-87c2-66184cff4b22-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-226gj\" (UniqueName: \"kubernetes.io/projected/84d6de19-34a7-495c-be2e-71b777330cee-kube-api-access-226gj\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tvv4\" (UniqueName: \"kubernetes.io/projected/df090bea-623b-4671-b682-e0880460ef04-kube-api-access-4tvv4\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489084 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-key\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: 
I1007 13:55:58.489107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-metrics-certs\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489163 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8x6m\" (UniqueName: \"kubernetes.io/projected/df8e1457-a676-4aeb-ad71-6162115e8b58-kube-api-access-t8x6m\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3436d60-53c5-4994-88f1-d3aa555e96bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-serving-cert\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-client\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489262 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvtj\" (UniqueName: \"kubernetes.io/projected/92e81bbc-e8d4-414d-8e09-d8bff498f83f-kube-api-access-fgvtj\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489264 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-config\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jfr\" (UniqueName: \"kubernetes.io/projected/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-kube-api-access-b7jfr\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489336 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3fdb570-c6b1-43a3-9726-b3870e21c38d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" 
(UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e68626-bdd9-4340-8d96-64d03109427a-config\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489401 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjcmv\" (UniqueName: \"kubernetes.io/projected/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-kube-api-access-gjcmv\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489420 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b6ce89a1-e415-49d3-956d-5a63739c4afb-tmpfs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3fdb570-c6b1-43a3-9726-b3870e21c38d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-client\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks62g\" (UniqueName: \"kubernetes.io/projected/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-kube-api-access-ks62g\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489524 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489542 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: 
I1007 13:55:58.489574 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-node-pullsecrets\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-image-import-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79cce85a-0be0-41da-87c2-66184cff4b22-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489652 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df090bea-623b-4671-b682-e0880460ef04-serving-cert\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e68626-bdd9-4340-8d96-64d03109427a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489719 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-service-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489740 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-serving-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0e68626-bdd9-4340-8d96-64d03109427a-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489811 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t252v\" (UniqueName: \"kubernetes.io/projected/10f378f0-4c92-4b53-9454-f9d83f3cb058-kube-api-access-t252v\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-cabundle\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489858 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-default-certificate\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-config\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbkxs\" (UniqueName: \"kubernetes.io/projected/b6ce89a1-e415-49d3-956d-5a63739c4afb-kube-api-access-lbkxs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.489987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df090bea-623b-4671-b682-e0880460ef04-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55c7q\" (UniqueName: \"kubernetes.io/projected/c3436d60-53c5-4994-88f1-d3aa555e96bd-kube-api-access-55c7q\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 
13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490079 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df8e1457-a676-4aeb-ad71-6162115e8b58-metrics-tls\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490147 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-serving-cert\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-config\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490225 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6m9q\" (UniqueName: \"kubernetes.io/projected/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-kube-api-access-x6m9q\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit-dir\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490316 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-images\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.490362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-stats-auth\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.491207 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-config\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.491239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-encryption-config\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.491761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-trusted-ca-bundle\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.491862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/df090bea-623b-4671-b682-e0880460ef04-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.491922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.492163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/79cce85a-0be0-41da-87c2-66184cff4b22-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.492320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-node-pullsecrets\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.492425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-serving-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.492675 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d3fdb570-c6b1-43a3-9726-b3870e21c38d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.492692 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-config\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.493102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-service-ca\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.493255 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-image-import-ca\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.493619 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-serving-cert\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.493750 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-audit-dir\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.493263 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.494000 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79cce85a-0be0-41da-87c2-66184cff4b22-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.494032 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-serving-cert\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.494133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d3fdb570-c6b1-43a3-9726-b3870e21c38d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.494198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-config\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.494525 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c3436d60-53c5-4994-88f1-d3aa555e96bd-images\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.495105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/92e81bbc-e8d4-414d-8e09-d8bff498f83f-etcd-client\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.495191 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/df8e1457-a676-4aeb-ad71-6162115e8b58-metrics-tls\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.495569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-serving-cert\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.496585 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c3436d60-53c5-4994-88f1-d3aa555e96bd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.496682 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df090bea-623b-4671-b682-e0880460ef04-serving-cert\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.497491 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-etcd-client\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.501079 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.520912 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.532103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f20492a2-514b-43c3-aa9b-f26b35e14a8a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: \"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.540596 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.560455 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.580625 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.600679 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.611566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.620206 4717 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.640302 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.661122 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.672189 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/10f378f0-4c92-4b53-9454-f9d83f3cb058-machine-approver-tls\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.680506 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.681977 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-auth-proxy-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.701200 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.711226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10f378f0-4c92-4b53-9454-f9d83f3cb058-config\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.720792 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.739990 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.760761 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.780489 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.800820 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.826621 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.860235 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.862046 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0e68626-bdd9-4340-8d96-64d03109427a-config\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.881214 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.881904 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.903155 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.921573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.925903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0e68626-bdd9-4340-8d96-64d03109427a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.940245 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.960975 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.980312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 07 13:55:58 crc kubenswrapper[4717]: I1007 13:55:58.984708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.000648 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.024530 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.034605 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.041062 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.042239 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.060254 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.070049 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.087498 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.093099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.100440 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.120960 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.140517 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.161071 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.175650 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-key\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.181081 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.200577 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.203054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-signing-cabundle\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.220419 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.240284 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.261058 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.280717 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.287376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-apiservice-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.291319 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b6ce89a1-e415-49d3-956d-5a63739c4afb-webhook-cert\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.301369 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.308558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.319368 4717 request.go:700] Waited for 1.011734653s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-dockercfg-zdk86&limit=500&resourceVersion=0 Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.320872 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.340782 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.361488 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.365832 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-default-certificate\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.380872 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.393932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-stats-auth\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.412802 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.421464 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.423705 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.433943 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84d6de19-34a7-495c-be2e-71b777330cee-metrics-certs\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.440479 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.449597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d6de19-34a7-495c-be2e-71b777330cee-service-ca-bundle\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.461595 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.468173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.481152 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.488705 4717 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync 
secret cache: timed out waiting for the condition
Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.488777 4717 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.488913 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session podName:88668bb1-9d3d-4761-a7a2-07a57d489243 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:59.988891705 +0000 UTC m=+141.816817497 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session") pod "oauth-openshift-558db77b4-4l69q" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.489083 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection podName:88668bb1-9d3d-4761-a7a2-07a57d489243 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:59.989036209 +0000 UTC m=+141.816962041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-4l69q" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243") : failed to sync secret cache: timed out waiting for the condition
Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.492466 4717 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition
Oct 07 13:55:59 crc kubenswrapper[4717]: E1007 13:55:59.492545 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error podName:88668bb1-9d3d-4761-a7a2-07a57d489243 nodeName:}" failed. No retries permitted until 2025-10-07 13:55:59.992524743 +0000 UTC m=+141.820450575 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-4l69q" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243") : failed to sync secret cache: timed out waiting for the condition Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.499882 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.521680 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.541147 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.561815 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.582813 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.600982 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.640844 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.660677 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.680620 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.701051 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.721370 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.740256 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.760144 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.780954 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.800867 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.820294 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" 
Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.840280 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.860130 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.879959 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.901230 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.921661 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.940135 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.960731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 07 13:55:59 crc kubenswrapper[4717]: I1007 13:55:59.980832 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.001891 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.014603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.014647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.014881 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.017323 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.017884 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.017904 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.020140 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.057107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkq8\" (UniqueName: \"kubernetes.io/projected/25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a-kube-api-access-wbkq8\") pod \"cluster-samples-operator-665b6dd947-cs9pq\" (UID: \"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.076122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pl49\" (UniqueName: \"kubernetes.io/projected/01a33ba0-7f01-465d-b359-4904d6b7a769-kube-api-access-4pl49\") pod \"openshift-apiserver-operator-796bbdcf4f-c4s7n\" (UID: \"01a33ba0-7f01-465d-b359-4904d6b7a769\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.093826 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp4hh\" (UniqueName: \"kubernetes.io/projected/d6d20fff-a312-47d5-a211-5bd15a65a26c-kube-api-access-bp4hh\") pod \"openshift-controller-manager-operator-756b6f6bc6-vl9lp\" (UID: \"d6d20fff-a312-47d5-a211-5bd15a65a26c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.114553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdm7t\" (UniqueName: \"kubernetes.io/projected/78d9c154-e844-48ff-80e3-89d1d17f1343-kube-api-access-kdm7t\") pod \"apiserver-7bbb656c7d-qb2bx\" (UID: \"78d9c154-e844-48ff-80e3-89d1d17f1343\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.118419 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.138025 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7bq\" (UniqueName: \"kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq\") pod \"route-controller-manager-6576b87f9c-6lfbq\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.153983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hznzb\" (UniqueName: \"kubernetes.io/projected/678c62ea-ce55-4cbd-b400-c5316419c3a9-kube-api-access-hznzb\") pod \"console-operator-58897d9998-4w2xj\" (UID: \"678c62ea-ce55-4cbd-b400-c5316419c3a9\") " pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.178497 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6m4t\" (UniqueName: \"kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t\") pod \"console-f9d7485db-k6dvg\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.180863 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.201457 4717 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.223081 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.240407 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.261534 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.280706 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.300488 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.300779 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.306629 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.320378 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.320386 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.336777 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.338708 4717 request.go:700] Waited for 1.910802918s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.340149 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.349445 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.360178 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.396898 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.425356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rthxr\" (UniqueName: \"kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr\") pod \"oauth-openshift-558db77b4-4l69q\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.436390 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.444162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jfr\" (UniqueName: \"kubernetes.io/projected/e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad-kube-api-access-b7jfr\") pod \"apiserver-76f77b778f-q2jhk\" (UID: \"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad\") " pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.447597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frb9c\" (UniqueName: \"kubernetes.io/projected/f20492a2-514b-43c3-aa9b-f26b35e14a8a-kube-api-access-frb9c\") pod \"package-server-manager-789f6589d5-nvx87\" (UID: \"f20492a2-514b-43c3-aa9b-f26b35e14a8a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.455150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79cce85a-0be0-41da-87c2-66184cff4b22-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lclzf\" (UID: \"79cce85a-0be0-41da-87c2-66184cff4b22\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.477735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8x6m\" (UniqueName: \"kubernetes.io/projected/df8e1457-a676-4aeb-ad71-6162115e8b58-kube-api-access-t8x6m\") pod \"dns-operator-744455d44c-h8764\" (UID: \"df8e1457-a676-4aeb-ad71-6162115e8b58\") " pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.495722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7hg4\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-kube-api-access-n7hg4\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.519972 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-226gj\" (UniqueName: \"kubernetes.io/projected/84d6de19-34a7-495c-be2e-71b777330cee-kube-api-access-226gj\") pod \"router-default-5444994796-2v924\" (UID: \"84d6de19-34a7-495c-be2e-71b777330cee\") " pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.527295 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4w2xj"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.538777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tvv4\" (UniqueName: \"kubernetes.io/projected/df090bea-623b-4671-b682-e0880460ef04-kube-api-access-4tvv4\") pod \"openshift-config-operator-7777fb866f-b55vq\" (UID: \"df090bea-623b-4671-b682-e0880460ef04\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:56:00 crc kubenswrapper[4717]: W1007 13:56:00.541445 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678c62ea_ce55_4cbd_b400_c5316419c3a9.slice/crio-20e326f13734f08194e77abca4b3b3a509154a6ccc7dbd4a26848215293e1887 WatchSource:0}: Error finding container 20e326f13734f08194e77abca4b3b3a509154a6ccc7dbd4a26848215293e1887: Status 404 returned error can't find the container with id 20e326f13734f08194e77abca4b3b3a509154a6ccc7dbd4a26848215293e1887 Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.545580 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.562425 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.564313 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks62g\" (UniqueName: \"kubernetes.io/projected/46171099-e5d7-49a0-8e63-f8b9d3f8b0d8-kube-api-access-ks62g\") pod \"control-plane-machine-set-operator-78cbb6b69f-8qs4m\" (UID: \"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.567895 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.581536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjcmv\" (UniqueName: \"kubernetes.io/projected/cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a-kube-api-access-gjcmv\") pod \"service-ca-operator-777779d784-497gd\" (UID: \"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.586335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" event={"ID":"d6d20fff-a312-47d5-a211-5bd15a65a26c","Type":"ContainerStarted","Data":"8254120fcd2757c13f9f32019f88611ffeb5277cb5f0312d8e447e6dba43f79e"} Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.586381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" event={"ID":"d6d20fff-a312-47d5-a211-5bd15a65a26c","Type":"ContainerStarted","Data":"27ba908177b7c68d927e9110267414291586e84765c78cfadac46dfce5dfad43"} Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.587686 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.588056 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" event={"ID":"678c62ea-ce55-4cbd-b400-c5316419c3a9","Type":"ContainerStarted","Data":"20e326f13734f08194e77abca4b3b3a509154a6ccc7dbd4a26848215293e1887"} Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.599729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55c7q\" (UniqueName: \"kubernetes.io/projected/c3436d60-53c5-4994-88f1-d3aa555e96bd-kube-api-access-55c7q\") pod \"machine-api-operator-5694c8668f-79gpc\" (UID: \"c3436d60-53c5-4994-88f1-d3aa555e96bd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.619966 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.621054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3fdb570-c6b1-43a3-9726-b3870e21c38d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v2qhz\" (UID: \"d3fdb570-c6b1-43a3-9726-b3870e21c38d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.629586 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.641265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbkxs\" (UniqueName: \"kubernetes.io/projected/b6ce89a1-e415-49d3-956d-5a63739c4afb-kube-api-access-lbkxs\") pod \"packageserver-d55dfcdfc-8cbq8\" (UID: \"b6ce89a1-e415-49d3-956d-5a63739c4afb\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.652063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.658315 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.662782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxskh\" (UniqueName: \"kubernetes.io/projected/71387a0e-c68c-4cda-bc81-9c78fab38a52-kube-api-access-mxskh\") pod \"migrator-59844c95c7-njglk\" (UID: \"71387a0e-c68c-4cda-bc81-9c78fab38a52\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.665594 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.672945 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.681552 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e0e68626-bdd9-4340-8d96-64d03109427a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mvs2v\" (UID: \"e0e68626-bdd9-4340-8d96-64d03109427a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.697439 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.698807 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.701998 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t252v\" (UniqueName: \"kubernetes.io/projected/10f378f0-4c92-4b53-9454-f9d83f3cb058-kube-api-access-t252v\") pod \"machine-approver-56656f9798-d79w8\" (UID: \"10f378f0-4c92-4b53-9454-f9d83f3cb058\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.705748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.716320 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6m9q\" (UniqueName: \"kubernetes.io/projected/8f85d8bb-3f65-40b1-bfd3-f878f44638ce-kube-api-access-x6m9q\") pod \"service-ca-9c57cc56f-tfzld\" (UID: \"8f85d8bb-3f65-40b1-bfd3-f878f44638ce\") " pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.719290 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.753198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvtj\" (UniqueName: \"kubernetes.io/projected/92e81bbc-e8d4-414d-8e09-d8bff498f83f-kube-api-access-fgvtj\") pod \"etcd-operator-b45778765-fbrgn\" (UID: \"92e81bbc-e8d4-414d-8e09-d8bff498f83f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.774278 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:56:00 crc kubenswrapper[4717]: W1007 13:56:00.789667 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd44616b_54d3_418a_ac99_9b7ff3c4d2d9.slice/crio-0e954b8d57150b807c4c44d7a6baf1dc613cae375c2afb07f720f4c04b5eff8d WatchSource:0}: Error finding container 0e954b8d57150b807c4c44d7a6baf1dc613cae375c2afb07f720f4c04b5eff8d: Status 404 returned error can't find the container with id 0e954b8d57150b807c4c44d7a6baf1dc613cae375c2afb07f720f4c04b5eff8d Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.825572 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.825941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldt5\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.825970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74lxc\" (UniqueName: \"kubernetes.io/projected/a24414df-b777-4ee7-8d31-41e802737e89-kube-api-access-74lxc\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.825993 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpzbs\" (UniqueName: \"kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51567150-6df0-4b38-aec7-d9e78df693dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826045 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xl5v\" (UniqueName: \"kubernetes.io/projected/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-kube-api-access-5xl5v\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a24414df-b777-4ee7-8d31-41e802737e89-serving-cert\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9r4\" (UniqueName: \"kubernetes.io/projected/a923ae7c-d6cd-4d73-a547-27ef6581da72-kube-api-access-8j9r4\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826139 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a923ae7c-d6cd-4d73-a547-27ef6581da72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826214 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826231 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826252 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826303 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-config\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826335 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a923ae7c-d6cd-4d73-a547-27ef6581da72-proxy-tls\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826364 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-proxy-tls\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9w9\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-kube-api-access-kk9w9\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826509 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-images\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826547 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-service-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826572 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: E1007 13:56:00.826645 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.326632626 +0000 UTC m=+143.154558418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826871 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44b7\" (UniqueName: \"kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51567150-6df0-4b38-aec7-d9e78df693dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.826956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.828195 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-h8764"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.833242 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-q2jhk"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.854263 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.858419 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.875775 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.880257 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.907981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.928909 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929190 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9r4\" (UniqueName: \"kubernetes.io/projected/a923ae7c-d6cd-4d73-a547-27ef6581da72-kube-api-access-8j9r4\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92eb792a-3195-446f-9b72-6e78a48f7b9e-config-volume\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929353 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929382 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6f2\" (UniqueName: \"kubernetes.io/projected/3ad55ef1-703c-423f-9ee5-254322fd0cb2-kube-api-access-vj6f2\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv6hp\" (UniqueName: \"kubernetes.io/projected/93b2f0cf-f03b-48f9-9b04-036c07f435b9-kube-api-access-wv6hp\") 
pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929479 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929520 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929570 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a923ae7c-d6cd-4d73-a547-27ef6581da72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929628 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mx47\" (UniqueName: \"kubernetes.io/projected/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-kube-api-access-8mx47\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929673 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5c6468-8f25-4833-a33e-ffacc52cbaff-config\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929696 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-csi-data-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929740 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncngn\" (UID: 
\"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929808 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929869 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hrs\" (UniqueName: \"kubernetes.io/projected/92eb792a-3195-446f-9b72-6e78a48f7b9e-kube-api-access-49hrs\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93b2f0cf-f03b-48f9-9b04-036c07f435b9-cert\") pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.929983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-srv-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930026 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-socket-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930090 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgp5\" (UniqueName: \"kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930145 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28k6j\" (UniqueName: \"kubernetes.io/projected/bea65594-0f14-4f94-86c9-6f3ae50daf32-kube-api-access-28k6j\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 
13:56:00.930188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92eb792a-3195-446f-9b72-6e78a48f7b9e-metrics-tls\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-config\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kclhg\" (UniqueName: \"kubernetes.io/projected/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-kube-api-access-kclhg\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ad55ef1-703c-423f-9ee5-254322fd0cb2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930461 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs6x9\" (UniqueName: \"kubernetes.io/projected/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-kube-api-access-xs6x9\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930535 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-certs\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930614 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a923ae7c-d6cd-4d73-a547-27ef6581da72-proxy-tls\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-proxy-tls\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f5c6468-8f25-4833-a33e-ffacc52cbaff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-registration-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-node-bootstrap-token\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9w9\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-kube-api-access-kk9w9\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-profile-collector-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-images\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.930991 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-service-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.931656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-service-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.933684 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: E1007 13:56:00.936654 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.430999223 +0000 UTC m=+143.258925015 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44b7\" (UniqueName: \"kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936880 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936908 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51567150-6df0-4b38-aec7-d9e78df693dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936936 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5c6468-8f25-4833-a33e-ffacc52cbaff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936960 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwr8\" (UniqueName: \"kubernetes.io/projected/757f147e-b766-4200-9a8e-c4fbad44fed0-kube-api-access-rtwr8\") pod \"downloads-7954f5f757-2r2zr\" (UID: \"757f147e-b766-4200-9a8e-c4fbad44fed0\") " pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.936992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937031 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-plugins-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937122 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dldt5\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937213 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74lxc\" (UniqueName: \"kubernetes.io/projected/a24414df-b777-4ee7-8d31-41e802737e89-kube-api-access-74lxc\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpzbs\" (UniqueName: \"kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937295 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-srv-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51567150-6df0-4b38-aec7-d9e78df693dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xl5v\" (UniqueName: \"kubernetes.io/projected/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-kube-api-access-5xl5v\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.937493 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.938578 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk689\" (UniqueName: \"kubernetes.io/projected/758eea7d-7bcc-4b1e-8690-485cd73090bc-kube-api-access-bk689\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.938636 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24414df-b777-4ee7-8d31-41e802737e89-serving-cert\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.938692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.938716 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.938742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-mountpoint-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.945305 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf"] Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.949486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a923ae7c-d6cd-4d73-a547-27ef6581da72-proxy-tls\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.951205 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a923ae7c-d6cd-4d73-a547-27ef6581da72-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 
crc kubenswrapper[4717]: I1007 13:56:00.953588 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.953895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.954321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-config\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.956103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.956739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51567150-6df0-4b38-aec7-d9e78df693dc-trusted-ca\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.956835 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.956903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-images\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.957815 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.958356 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-proxy-tls\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.959944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a24414df-b777-4ee7-8d31-41e802737e89-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.961044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.961573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.962676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.965141 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/51567150-6df0-4b38-aec7-d9e78df693dc-metrics-tls\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.965540 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.967171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.973360 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.974447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j9r4\" (UniqueName: \"kubernetes.io/projected/a923ae7c-d6cd-4d73-a547-27ef6581da72-kube-api-access-8j9r4\") pod \"machine-config-controller-84d6567774-mt9vs\" (UID: \"a923ae7c-d6cd-4d73-a547-27ef6581da72\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.974611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.977786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a24414df-b777-4ee7-8d31-41e802737e89-serving-cert\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.978334 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" Oct 07 13:56:00 crc kubenswrapper[4717]: I1007 13:56:00.980084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.009810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lxc\" (UniqueName: \"kubernetes.io/projected/a24414df-b777-4ee7-8d31-41e802737e89-kube-api-access-74lxc\") pod \"authentication-operator-69f744f599-cr6vm\" (UID: \"a24414df-b777-4ee7-8d31-41e802737e89\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.017973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpzbs\" (UniqueName: \"kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs\") pod \"controller-manager-879f6c89f-9lpb4\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.024345 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040731 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hrs\" (UniqueName: \"kubernetes.io/projected/92eb792a-3195-446f-9b72-6e78a48f7b9e-kube-api-access-49hrs\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93b2f0cf-f03b-48f9-9b04-036c07f435b9-cert\") pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040793 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-srv-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040814 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgp5\" (UniqueName: \"kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-socket-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 
crc kubenswrapper[4717]: I1007 13:56:01.040867 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28k6j\" (UniqueName: \"kubernetes.io/projected/bea65594-0f14-4f94-86c9-6f3ae50daf32-kube-api-access-28k6j\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040899 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92eb792a-3195-446f-9b72-6e78a48f7b9e-metrics-tls\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kclhg\" (UniqueName: \"kubernetes.io/projected/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-kube-api-access-kclhg\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.040987 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ad55ef1-703c-423f-9ee5-254322fd0cb2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041045 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs6x9\" (UniqueName: \"kubernetes.io/projected/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-kube-api-access-xs6x9\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041090 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-certs\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9f5c6468-8f25-4833-a33e-ffacc52cbaff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-registration-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041154 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-node-bootstrap-token\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-profile-collector-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041226 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5c6468-8f25-4833-a33e-ffacc52cbaff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041240 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041250 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwr8\" (UniqueName: \"kubernetes.io/projected/757f147e-b766-4200-9a8e-c4fbad44fed0-kube-api-access-rtwr8\") pod \"downloads-7954f5f757-2r2zr\" (UID: \"757f147e-b766-4200-9a8e-c4fbad44fed0\") " pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-plugins-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-srv-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041399 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk689\" (UniqueName: \"kubernetes.io/projected/758eea7d-7bcc-4b1e-8690-485cd73090bc-kube-api-access-bk689\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041433 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041450 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-mountpoint-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041468 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92eb792a-3195-446f-9b72-6e78a48f7b9e-config-volume\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041511 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6f2\" (UniqueName: \"kubernetes.io/projected/3ad55ef1-703c-423f-9ee5-254322fd0cb2-kube-api-access-vj6f2\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041530 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv6hp\" (UniqueName: \"kubernetes.io/projected/93b2f0cf-f03b-48f9-9b04-036c07f435b9-kube-api-access-wv6hp\") pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041559 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041583 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mx47\" (UniqueName: \"kubernetes.io/projected/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-kube-api-access-8mx47\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5c6468-8f25-4833-a33e-ffacc52cbaff-config\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-csi-data-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.041753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-csi-data-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.043491 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44b7\" (UniqueName: \"kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7\") pod \"marketplace-operator-79b997595-t9hbf\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.043757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-socket-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.044125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-plugins-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.044282 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.544262068 +0000 UTC m=+143.372187860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.048167 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.048361 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-mountpoint-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.055660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5c6468-8f25-4833-a33e-ffacc52cbaff-config\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.056272 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3ad55ef1-703c-423f-9ee5-254322fd0cb2-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.056712 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/92eb792a-3195-446f-9b72-6e78a48f7b9e-config-volume\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.057296 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-certs\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.057843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bea65594-0f14-4f94-86c9-6f3ae50daf32-registration-dir\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.057861 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-srv-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: 
\"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.057990 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/92eb792a-3195-446f-9b72-6e78a48f7b9e-metrics-tls\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.058983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.061518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.062811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-srv-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.063081 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93b2f0cf-f03b-48f9-9b04-036c07f435b9-cert\") pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.063613 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.068745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f5c6468-8f25-4833-a33e-ffacc52cbaff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.068768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-profile-collector-cert\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.078301 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.081488 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.086518 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.088144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/758eea7d-7bcc-4b1e-8690-485cd73090bc-node-bootstrap-token\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.088818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldt5\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.088949 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.102878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xl5v\" (UniqueName: \"kubernetes.io/projected/794eee5b-ff81-40a4-8e8b-8e97d651d7bc-kube-api-access-5xl5v\") pod \"machine-config-operator-74547568cd-7b25n\" (UID: \"794eee5b-ff81-40a4-8e8b-8e97d651d7bc\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.122799 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9w9\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-kube-api-access-kk9w9\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.126869 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.143735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51567150-6df0-4b38-aec7-d9e78df693dc-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j6lnh\" (UID: \"51567150-6df0-4b38-aec7-d9e78df693dc\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.144197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.144339 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.644310789 +0000 UTC m=+143.472236581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.144523 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.144843 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 13:56:01.644828083 +0000 UTC m=+143.472753875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.169082 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b55vq"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.180191 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kclhg\" (UniqueName: \"kubernetes.io/projected/f24295a8-cbdd-48d0-b48a-b73e86e53fc9-kube-api-access-kclhg\") pod \"olm-operator-6b444d44fb-prnk6\" (UID: \"f24295a8-cbdd-48d0-b48a-b73e86e53fc9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.196336 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6ce89a1_e415_49d3_956d_5a63739c4afb.slice/crio-46bbfb158e0b4e9cd41634bdc7ffbc01509f9a0027db59e1ee1da4cef87a3e05 WatchSource:0}: Error finding container 46bbfb158e0b4e9cd41634bdc7ffbc01509f9a0027db59e1ee1da4cef87a3e05: Status 404 returned error can't find the container with id 46bbfb158e0b4e9cd41634bdc7ffbc01509f9a0027db59e1ee1da4cef87a3e05 Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.203570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hrs\" (UniqueName: \"kubernetes.io/projected/92eb792a-3195-446f-9b72-6e78a48f7b9e-kube-api-access-49hrs\") pod \"dns-default-gpv6z\" (UID: \"92eb792a-3195-446f-9b72-6e78a48f7b9e\") " pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.220618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28k6j\" (UniqueName: \"kubernetes.io/projected/bea65594-0f14-4f94-86c9-6f3ae50daf32-kube-api-access-28k6j\") pod \"csi-hostpathplugin-jjtwt\" (UID: \"bea65594-0f14-4f94-86c9-6f3ae50daf32\") " pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.222375 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.233957 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf090bea_623b_4671_b682_e0880460ef04.slice/crio-2f3fa534e39218bc559460e8856a38d16d4f1cc9742a030a46fa1a66970575f9 WatchSource:0}: Error finding container 2f3fa534e39218bc559460e8856a38d16d4f1cc9742a030a46fa1a66970575f9: Status 404 returned error can't find the container with id 2f3fa534e39218bc559460e8856a38d16d4f1cc9742a030a46fa1a66970575f9 Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.239535 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.245326 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.245827 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.74581138 +0000 UTC m=+143.573737172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.258158 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.260349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs6x9\" (UniqueName: \"kubernetes.io/projected/eee5a1d3-cec9-45fb-91ff-77d20bae5c80-kube-api-access-xs6x9\") pod \"catalog-operator-68c6474976-k9vz5\" (UID: \"eee5a1d3-cec9-45fb-91ff-77d20bae5c80\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.265865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f5c6468-8f25-4833-a33e-ffacc52cbaff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nwg5z\" (UID: \"9f5c6468-8f25-4833-a33e-ffacc52cbaff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.285450 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.286410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk689\" (UniqueName: \"kubernetes.io/projected/758eea7d-7bcc-4b1e-8690-485cd73090bc-kube-api-access-bk689\") pod \"machine-config-server-285fj\" (UID: \"758eea7d-7bcc-4b1e-8690-485cd73090bc\") " pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.300030 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6f2\" (UniqueName: \"kubernetes.io/projected/3ad55ef1-703c-423f-9ee5-254322fd0cb2-kube-api-access-vj6f2\") pod \"multus-admission-controller-857f4d67dd-98nvq\" (UID: \"3ad55ef1-703c-423f-9ee5-254322fd0cb2\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.315452 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.319482 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.319698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgp5\" (UniqueName: \"kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5\") pod \"collect-profiles-29330745-kpzmq\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.341517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mx47\" (UniqueName: \"kubernetes.io/projected/2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd-kube-api-access-8mx47\") pod \"kube-storage-version-migrator-operator-b67b599dd-sgd7c\" (UID: \"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.341832 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.347507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.348600 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.848579414 +0000 UTC m=+143.676505206 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.353342 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.357670 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.361886 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv6hp\" (UniqueName: \"kubernetes.io/projected/93b2f0cf-f03b-48f9-9b04-036c07f435b9-kube-api-access-wv6hp\") pod \"ingress-canary-b8ftz\" (UID: \"93b2f0cf-f03b-48f9-9b04-036c07f435b9\") " pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.369423 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-79gpc"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.369724 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.371108 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.383740 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwr8\" (UniqueName: \"kubernetes.io/projected/757f147e-b766-4200-9a8e-c4fbad44fed0-kube-api-access-rtwr8\") pod \"downloads-7954f5f757-2r2zr\" (UID: \"757f147e-b766-4200-9a8e-c4fbad44fed0\") " pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.397485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.404469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.404548 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-285fj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.418515 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b8ftz" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.441808 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fbrgn"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.457564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.457916 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:01.957898424 +0000 UTC m=+143.785824216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.465237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-497gd"] Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.478364 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71387a0e_c68c_4cda_bc81_9c78fab38a52.slice/crio-73ac06a355dcd35bc36f4f474c8ec4080988169166317bf7b6c2e87b86ace370 WatchSource:0}: Error finding container 73ac06a355dcd35bc36f4f474c8ec4080988169166317bf7b6c2e87b86ace370: Status 404 returned error can't find the container with id 73ac06a355dcd35bc36f4f474c8ec4080988169166317bf7b6c2e87b86ace370 Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.497543 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.558826 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-tfzld"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.559390 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.559736 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.059722342 +0000 UTC m=+143.887648134 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.562777 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3fdb570_c6b1_43a3_9726_b3870e21c38d.slice/crio-60d3ce505bf429fc9cb384b70174f4adb58b34f9df066f218b7645ebec673599 WatchSource:0}: Error finding container 60d3ce505bf429fc9cb384b70174f4adb58b34f9df066f218b7645ebec673599: Status 404 returned error can't find the container with id 60d3ce505bf429fc9cb384b70174f4adb58b34f9df066f218b7645ebec673599 Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.589601 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-cr6vm"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.598941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" event={"ID":"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8","Type":"ContainerStarted","Data":"92145da8f8794b1ec7856aeb3f65707de02aaefe8fddd2e5697b078b9d97d1b2"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.604163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" event={"ID":"10f378f0-4c92-4b53-9454-f9d83f3cb058","Type":"ContainerStarted","Data":"ebe0920b0d3d54b1fd93a60eab561ff3e2d7900184b58f28b6ed361dde71a938"} Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.604169 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f85d8bb_3f65_40b1_bfd3_f878f44638ce.slice/crio-31b60cbf984dd40e8fd3391ecfb6cee9712f6e208f0627ad944f023feb4cc618 WatchSource:0}: Error finding container 31b60cbf984dd40e8fd3391ecfb6cee9712f6e208f0627ad944f023feb4cc618: Status 404 returned error can't find the container with id 31b60cbf984dd40e8fd3391ecfb6cee9712f6e208f0627ad944f023feb4cc618 Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.608374 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf2c0436_e70a_4f8f_99a8_2cb9fbfe7a6a.slice/crio-b312f70f3bc380e6e24b57879cb398d7e616d5d728c5db6da4eb51b9f43bdb7a WatchSource:0}: Error finding container b312f70f3bc380e6e24b57879cb398d7e616d5d728c5db6da4eb51b9f43bdb7a: Status 404 returned error can't find the container with id b312f70f3bc380e6e24b57879cb398d7e616d5d728c5db6da4eb51b9f43bdb7a Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.609442 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.609482 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.610828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" event={"ID":"01a33ba0-7f01-465d-b359-4904d6b7a769","Type":"ContainerStarted","Data":"93282c79fa8cf70c6b900f58bc6e741b5715b8aeecbcfbea35ee0353b1871373"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.610856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" event={"ID":"01a33ba0-7f01-465d-b359-4904d6b7a769","Type":"ContainerStarted","Data":"919696cd11f3a3ba0b019698a87a224216cca87f15e116a1cb3a3154b6409f43"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.612385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" event={"ID":"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad","Type":"ContainerStarted","Data":"feda8e12e2f7ca7f7864f31b288b6021605afaddd718cdfdf6b0fa200d38fdb3"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.615500 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" event={"ID":"79cce85a-0be0-41da-87c2-66184cff4b22","Type":"ContainerStarted","Data":"491b51a0590484fbcb411d0ce2c69e6590c09d49498755485523f32fcd6102d3"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.618053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" event={"ID":"b6ce89a1-e415-49d3-956d-5a63739c4afb","Type":"ContainerStarted","Data":"46bbfb158e0b4e9cd41634bdc7ffbc01509f9a0027db59e1ee1da4cef87a3e05"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.619519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" event={"ID":"71387a0e-c68c-4cda-bc81-9c78fab38a52","Type":"ContainerStarted","Data":"73ac06a355dcd35bc36f4f474c8ec4080988169166317bf7b6c2e87b86ace370"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.622738 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k6dvg" event={"ID":"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9","Type":"ContainerStarted","Data":"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.622760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k6dvg" event={"ID":"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9","Type":"ContainerStarted","Data":"0e954b8d57150b807c4c44d7a6baf1dc613cae375c2afb07f720f4c04b5eff8d"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.627736 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.631389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" event={"ID":"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a","Type":"ContainerStarted","Data":"37b1349119f0bcba19679692324e0e51eec60b437707a37b52a05a20cb97065d"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.631415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" event={"ID":"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a","Type":"ContainerStarted","Data":"528b4c2519bb4d5cb45e75ac4b5aee15d21ec4bdff5fe91892c6046d7ee89739"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.633506 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" event={"ID":"f20492a2-514b-43c3-aa9b-f26b35e14a8a","Type":"ContainerStarted","Data":"b7a73a74a23180b2534ac02ee1538c9b6cbcc2f0339f1b53f6769c462d073b9e"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.635339 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.635785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" event={"ID":"c3436d60-53c5-4994-88f1-d3aa555e96bd","Type":"ContainerStarted","Data":"4f57cbb48e7c971b5a10de76c368c8e69858382836d050b92004c1214d39fb2c"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.642954 4717 generic.go:334] "Generic (PLEG): container finished" podID="78d9c154-e844-48ff-80e3-89d1d17f1343" containerID="244846b632569ddef33c2fa241fcfac3a7b571e2eacd5e6b26ed0d4ae137675c" exitCode=0 Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.643201 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" event={"ID":"78d9c154-e844-48ff-80e3-89d1d17f1343","Type":"ContainerDied","Data":"244846b632569ddef33c2fa241fcfac3a7b571e2eacd5e6b26ed0d4ae137675c"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.643230 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" event={"ID":"78d9c154-e844-48ff-80e3-89d1d17f1343","Type":"ContainerStarted","Data":"738c3e85503e507fc89c945475e6467db96a308b6c9e24e6f90ac5946bbd7aae"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.660927 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.661247 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.161234463 +0000 UTC m=+143.989160255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.666730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2v924" event={"ID":"84d6de19-34a7-495c-be2e-71b777330cee","Type":"ContainerStarted","Data":"97527040e4da938af9d98ae6ce07b8b56b2e7f4338ff5e0e23529c52607cf68d"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.666771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2v924" event={"ID":"84d6de19-34a7-495c-be2e-71b777330cee","Type":"ContainerStarted","Data":"fbffb5f150bdf9f8af8dbb6778f1939299b722bf2fd2c5f2515f0a35ca46062b"} Oct 07 13:56:01 crc kubenswrapper[4717]: W1007 13:56:01.670097 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e68626_bdd9_4340_8d96_64d03109427a.slice/crio-baff7ad596957c5708cc9e84fd76543dd802ec9b889626fff5edfeb6f3fb710c WatchSource:0}: Error finding container baff7ad596957c5708cc9e84fd76543dd802ec9b889626fff5edfeb6f3fb710c: Status 404 returned error can't find the container with id baff7ad596957c5708cc9e84fd76543dd802ec9b889626fff5edfeb6f3fb710c Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.670519 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" event={"ID":"df090bea-623b-4671-b682-e0880460ef04","Type":"ContainerStarted","Data":"2f3fa534e39218bc559460e8856a38d16d4f1cc9742a030a46fa1a66970575f9"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.699160 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.707450 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" event={"ID":"678c62ea-ce55-4cbd-b400-c5316419c3a9","Type":"ContainerStarted","Data":"8646ea5bfcbb2805537adabb3b3f8ad991a9ff1766643860d1371e18e14dea7b"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.708392 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.717601 4717 patch_prober.go:28] interesting pod/console-operator-58897d9998-4w2xj container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.717661 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.717686 4717 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-4w2xj" podUID="678c62ea-ce55-4cbd-b400-c5316419c3a9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.717731 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.720583 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.730209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" event={"ID":"7547f514-4e34-4e57-ac67-a4d57b1e7e18","Type":"ContainerStarted","Data":"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.730288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" event={"ID":"7547f514-4e34-4e57-ac67-a4d57b1e7e18","Type":"ContainerStarted","Data":"8c779563afd65e70fcab3772ed3f72bb7fee4186aa7fc6d2cb391c0cc752a807"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.735443 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.772880 4717 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-6lfbq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.772967 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.773834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" event={"ID":"d3fdb570-c6b1-43a3-9726-b3870e21c38d","Type":"ContainerStarted","Data":"60d3ce505bf429fc9cb384b70174f4adb58b34f9df066f218b7645ebec673599"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.780516 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.784935 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.284914167 +0000 UTC m=+144.112839959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.848351 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" event={"ID":"88668bb1-9d3d-4761-a7a2-07a57d489243","Type":"ContainerStarted","Data":"4d4ab5bfb160b26e23c3223adc6370e4b9f9caea639e676f85a9dfd28ce0b56f"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.868249 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.871335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" event={"ID":"df8e1457-a676-4aeb-ad71-6162115e8b58","Type":"ContainerStarted","Data":"a7760c97dbc314d86c759fe627f0a5c8d0b28e58bb8d7359c0609af0d22217a8"} Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.884381 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.886396 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.386372196 +0000 UTC m=+144.214297988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:01 crc kubenswrapper[4717]: I1007 13:56:01.989857 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:01 crc kubenswrapper[4717]: E1007 13:56:01.990506 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 13:56:02.490494207 +0000 UTC m=+144.318419999 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.091318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.091982 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.591963235 +0000 UTC m=+144.419889027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.108732 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-c4s7n" podStartSLOduration=123.108716484 podStartE2EDuration="2m3.108716484s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.107861761 +0000 UTC m=+143.935787553" watchObservedRunningTime="2025-10-07 13:56:02.108716484 +0000 UTC m=+143.936642276" Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.157219 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.178362 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-k6dvg" podStartSLOduration=123.17834404 podStartE2EDuration="2m3.17834404s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.17649668 +0000 UTC m=+144.004422492" watchObservedRunningTime="2025-10-07 13:56:02.17834404 +0000 UTC m=+144.006269842" Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.192807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: 
\"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.193114 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.693102716 +0000 UTC m=+144.521028508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.293678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.294310 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.794294977 +0000 UTC m=+144.622220769 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.296099 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.332651 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vl9lp" podStartSLOduration=123.332557683 podStartE2EDuration="2m3.332557683s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.330776355 +0000 UTC m=+144.158702147" watchObservedRunningTime="2025-10-07 13:56:02.332557683 +0000 UTC m=+144.160483485" Oct 07 13:56:02 crc kubenswrapper[4717]: W1007 13:56:02.351812 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod794eee5b_ff81_40a4_8e8b_8e97d651d7bc.slice/crio-a04c32a1a8ddcd3eff9788571f545d9cadda1ca142860d265322910012f22c55 WatchSource:0}: Error finding container a04c32a1a8ddcd3eff9788571f545d9cadda1ca142860d265322910012f22c55: Status 404 returned error can't find the container with id a04c32a1a8ddcd3eff9788571f545d9cadda1ca142860d265322910012f22c55 Oct 07 13:56:02 crc 
kubenswrapper[4717]: I1007 13:56:02.379413 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" podStartSLOduration=122.379394428 podStartE2EDuration="2m2.379394428s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.376044098 +0000 UTC m=+144.203969900" watchObservedRunningTime="2025-10-07 13:56:02.379394428 +0000 UTC m=+144.207320220" Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.396728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.397126 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:02.897110813 +0000 UTC m=+144.725036605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.451648 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.506800 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.506990 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.006955627 +0000 UTC m=+144.834881419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.507372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.507781 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.007766428 +0000 UTC m=+144.835692220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.536022 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.581963 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.583914 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.608702 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.609193 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.109177406 +0000 UTC m=+144.937103198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: W1007 13:56:02.646505 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51567150_6df0_4b38_aec7_d9e78df693dc.slice/crio-71a4beb28057cb4195599bf6750aeaea47e1b2f8691b2f545451bfb0162d36ea WatchSource:0}: Error finding container 71a4beb28057cb4195599bf6750aeaea47e1b2f8691b2f545451bfb0162d36ea: Status 404 returned error can't find the container with id 71a4beb28057cb4195599bf6750aeaea47e1b2f8691b2f545451bfb0162d36ea Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.716821 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.717274 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.217262003 +0000 UTC m=+145.045187795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.717850 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:02 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:02 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:02 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.717905 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.798456 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gpv6z"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.812442 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq"] Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.816688 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" podStartSLOduration=123.816675617 podStartE2EDuration="2m3.816675617s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.816068441 +0000 UTC m=+144.643994243" watchObservedRunningTime="2025-10-07 13:56:02.816675617 +0000 UTC m=+144.644601399" Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.817357 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.817786 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.317769426 +0000 UTC m=+145.145695218 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.948273 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:02 crc kubenswrapper[4717]: E1007 13:56:02.948611 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.448595962 +0000 UTC m=+145.276521754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:02 crc kubenswrapper[4717]: I1007 13:56:02.980523 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2v924" podStartSLOduration=123.980506348 podStartE2EDuration="2m3.980506348s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:02.97761751 +0000 UTC m=+144.805543302" watchObservedRunningTime="2025-10-07 13:56:02.980506348 +0000 UTC m=+144.808432140" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.014087 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-98nvq"] Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.014145 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c"] Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.014162 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b8ftz"] Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.014175 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jjtwt"] Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.049355 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.050171 4717 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.550144384 +0000 UTC m=+145.378070176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.050418 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.050682 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.550672208 +0000 UTC m=+145.378598000 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.107673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" event={"ID":"8f85d8bb-3f65-40b1-bfd3-f878f44638ce","Type":"ContainerStarted","Data":"31b60cbf984dd40e8fd3391ecfb6cee9712f6e208f0627ad944f023feb4cc618"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.120524 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2r2zr"] Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.153521 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.153788 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.653774381 +0000 UTC m=+145.481700173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.157982 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" event={"ID":"25fac7f0-7a1e-4e24-a4e8-fbf33feaec5a","Type":"ContainerStarted","Data":"8b36743866300015b8b749a2e2af18f3e842fa153b25355c72fa3142a3a8490a"} Oct 07 13:56:03 crc kubenswrapper[4717]: W1007 13:56:03.167674 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7d8fd9_1ef3_407e_b3bf_1f02d7ea1cfd.slice/crio-4137b630b0e9f9328af5e54271cbf4392b51132cd23616d83bcb9320485642ec WatchSource:0}: Error finding container 4137b630b0e9f9328af5e54271cbf4392b51132cd23616d83bcb9320485642ec: Status 404 returned error can't find the container with id 4137b630b0e9f9328af5e54271cbf4392b51132cd23616d83bcb9320485642ec Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.182049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" event={"ID":"1281b203-c580-4ebb-8c75-d6c8bafb3ca2","Type":"ContainerStarted","Data":"9ce0fbc260678964cb3a325f3b78fb24a85cbc8f17bb482e90e33adc93115403"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.184879 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.188203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" event={"ID":"51567150-6df0-4b38-aec7-d9e78df693dc","Type":"ContainerStarted","Data":"71a4beb28057cb4195599bf6750aeaea47e1b2f8691b2f545451bfb0162d36ea"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.205817 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t9hbf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.205889 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.212180 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" event={"ID":"794eee5b-ff81-40a4-8e8b-8e97d651d7bc","Type":"ContainerStarted","Data":"a04c32a1a8ddcd3eff9788571f545d9cadda1ca142860d265322910012f22c55"} Oct 07 13:56:03 crc kubenswrapper[4717]: W1007 13:56:03.216515 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad55ef1_703c_423f_9ee5_254322fd0cb2.slice/crio-8b7b6c5e74d96c69a24aa4aea5edfcab3717df5382a8e1b4dc0a5d533ff53a88 WatchSource:0}: Error finding container 8b7b6c5e74d96c69a24aa4aea5edfcab3717df5382a8e1b4dc0a5d533ff53a88: Status 404 returned error can't find the container with id 8b7b6c5e74d96c69a24aa4aea5edfcab3717df5382a8e1b4dc0a5d533ff53a88 Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.219440 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" event={"ID":"a24414df-b777-4ee7-8d31-41e802737e89","Type":"ContainerStarted","Data":"9addad2e0ef77fe64607d17b1c2cddcd433d01a4d5c128f79aff33b497caa1d2"} Oct 07 13:56:03 crc kubenswrapper[4717]: W1007 13:56:03.222706 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93b2f0cf_f03b_48f9_9b04_036c07f435b9.slice/crio-bd84633510a491fcd4b32a3f97957a47f8b5cb91dd0fd31a35586626c0514362 WatchSource:0}: Error finding container bd84633510a491fcd4b32a3f97957a47f8b5cb91dd0fd31a35586626c0514362: Status 404 returned error can't find the container with id bd84633510a491fcd4b32a3f97957a47f8b5cb91dd0fd31a35586626c0514362 Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.231720 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" event={"ID":"f20492a2-514b-43c3-aa9b-f26b35e14a8a","Type":"ContainerStarted","Data":"f320a29d9b234aaaadf5c0f1f62859d084f802d8bb5b9b4730905d5e3ffdcdb8"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.231758 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" event={"ID":"f20492a2-514b-43c3-aa9b-f26b35e14a8a","Type":"ContainerStarted","Data":"15aa88b4c2afb95f467904e78b6b0dcc56432f4f2fdb908e258af55e81a6292b"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.232281 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.235730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" event={"ID":"c3436d60-53c5-4994-88f1-d3aa555e96bd","Type":"ContainerStarted","Data":"181c9886319e1916d326c6ae609e69d77ee95ca046774c3afe5dfb38470ee4af"} Oct 07 13:56:03 crc kubenswrapper[4717]: W1007 13:56:03.242254 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbea65594_0f14_4f94_86c9_6f3ae50daf32.slice/crio-9d1fa11ba3f05d82e6970612b5c826eb813f40ce6d42531621aebc9b86e5a646 WatchSource:0}: Error finding container 9d1fa11ba3f05d82e6970612b5c826eb813f40ce6d42531621aebc9b86e5a646: Status 404 returned error can't find the container with id 9d1fa11ba3f05d82e6970612b5c826eb813f40ce6d42531621aebc9b86e5a646 Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.252694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" event={"ID":"67e983f2-2f6b-40c2-9690-bf0199c02f04","Type":"ContainerStarted","Data":"3e664b772462a6f778f7f49fa8782fe4655479c23c5b9d2b26ee2f1ed6c5559f"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.253065 4717 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.254577 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.255059 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.755044835 +0000 UTC m=+145.582970627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.259901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" event={"ID":"88668bb1-9d3d-4761-a7a2-07a57d489243","Type":"ContainerStarted","Data":"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.260952 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.270560 4717 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-9lpb4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.270597 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.287117 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" event={"ID":"46171099-e5d7-49a0-8e63-f8b9d3f8b0d8","Type":"ContainerStarted","Data":"16aad0944a0daf96b6ae385624a590cc92c75fc1d337c0f7a444e83fa4b91f9c"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.295477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-285fj" event={"ID":"758eea7d-7bcc-4b1e-8690-485cd73090bc","Type":"ContainerStarted","Data":"5f561b2a18465a1c799bddddee9d3dad5d29d4ffae237a5de676f5294fc97052"} Oct 07 13:56:03 crc kubenswrapper[4717]: W1007 13:56:03.300136 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod757f147e_b766_4200_9a8e_c4fbad44fed0.slice/crio-0acfcd60ded6d008b6107e26ce83df4c3d05491ff1148dd194c0e8ad91ee875f WatchSource:0}: Error finding container 0acfcd60ded6d008b6107e26ce83df4c3d05491ff1148dd194c0e8ad91ee875f: Status 404 returned error can't find the container with id 0acfcd60ded6d008b6107e26ce83df4c3d05491ff1148dd194c0e8ad91ee875f Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.308761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" event={"ID":"eee5a1d3-cec9-45fb-91ff-77d20bae5c80","Type":"ContainerStarted","Data":"bc5ec327aa4d6ea33071ad128d06d447b7d03c13bdef255ba61d5dea52415e0a"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.309993 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.312446 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" event={"ID":"f24295a8-cbdd-48d0-b48a-b73e86e53fc9","Type":"ContainerStarted","Data":"d36f74ad0f97047b31abb9ee85c80586c279437f75a356ab5561902be301701d"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.314636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" event={"ID":"92e81bbc-e8d4-414d-8e09-d8bff498f83f","Type":"ContainerStarted","Data":"dc74e25d5870c1a0a93b0944e637a6190427ba7ea5c12e429033c4d7746104bc"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.314891 4717 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k9vz5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.314924 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" podUID="eee5a1d3-cec9-45fb-91ff-77d20bae5c80" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.321653 4717 generic.go:334] "Generic (PLEG): container finished" podID="df090bea-623b-4671-b682-e0880460ef04" containerID="c7dccff4c3dbc292ffbed64457e308eb8bcd6d8773956a7592eba13bf00629b1" exitCode=0 Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.321736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" event={"ID":"df090bea-623b-4671-b682-e0880460ef04","Type":"ContainerDied","Data":"c7dccff4c3dbc292ffbed64457e308eb8bcd6d8773956a7592eba13bf00629b1"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.342027 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" event={"ID":"df8e1457-a676-4aeb-ad71-6162115e8b58","Type":"ContainerStarted","Data":"38b4e2705758e933de960afc05e353be60e33d3b50ccbac4ebadcd4b8ee90637"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.355651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.359664 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.859624497 +0000 UTC m=+145.687550289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.380527 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" event={"ID":"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a","Type":"ContainerStarted","Data":"b312f70f3bc380e6e24b57879cb398d7e616d5d728c5db6da4eb51b9f43bdb7a"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.388901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" event={"ID":"e0e68626-bdd9-4340-8d96-64d03109427a","Type":"ContainerStarted","Data":"baff7ad596957c5708cc9e84fd76543dd802ec9b889626fff5edfeb6f3fb710c"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.424814 4717 generic.go:334] "Generic (PLEG): container finished" podID="e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad" containerID="78a627aa94e1f1d3552e74e12698db1d58b6f13d45013ca9c21641fe61194e46" exitCode=0 Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.424949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" event={"ID":"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad","Type":"ContainerDied","Data":"78a627aa94e1f1d3552e74e12698db1d58b6f13d45013ca9c21641fe61194e46"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.432115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" event={"ID":"b6ce89a1-e415-49d3-956d-5a63739c4afb","Type":"ContainerStarted","Data":"a14d468d3c0f1540c444f0ce0550ee9b055c4fa9802cb8738eb9c95449958377"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.433267 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.444291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" event={"ID":"79cce85a-0be0-41da-87c2-66184cff4b22","Type":"ContainerStarted","Data":"f46ab6c844aa5bb1800937c2296ffadf2ab48761bf9a57a777a87cc384a6f5f9"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.446127 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" 
event={"ID":"9f5c6468-8f25-4833-a33e-ffacc52cbaff","Type":"ContainerStarted","Data":"f92f900f27523cb716b1fb9e9e763e49ce8876ffc6b5c57bf1757a6c265c5abb"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.449114 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" event={"ID":"10f378f0-4c92-4b53-9454-f9d83f3cb058","Type":"ContainerStarted","Data":"b312ce381287c871f50fc76fd67c88f59bef0a941e9e244c08782f5b413dcc62"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.457786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" event={"ID":"a923ae7c-d6cd-4d73-a547-27ef6581da72","Type":"ContainerStarted","Data":"62dddac5ba6ca44fec1824a2e06acc115b04ea78e7190e8bbc727985560ffdcc"} Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.464174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.467782 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.470707 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:03.970691024 +0000 UTC m=+145.798616816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.474131 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4w2xj" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.566691 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.567058 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.067041376 +0000 UTC m=+145.894967168 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.596702 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" podStartSLOduration=123.596676211 podStartE2EDuration="2m3.596676211s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.577156737 +0000 UTC m=+145.405082539" watchObservedRunningTime="2025-10-07 13:56:03.596676211 +0000 UTC m=+145.424602023" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.619901 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.627045 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8qs4m" podStartSLOduration=123.627021414 podStartE2EDuration="2m3.627021414s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.614234201 +0000 UTC m=+145.442160013" watchObservedRunningTime="2025-10-07 13:56:03.627021414 +0000 UTC m=+145.454947206" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.655286 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" podStartSLOduration=124.655271881 podStartE2EDuration="2m4.655271881s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.653980336 +0000 UTC m=+145.481906128" watchObservedRunningTime="2025-10-07 13:56:03.655271881 +0000 UTC m=+145.483197673" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.669715 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.670065 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.170052887 +0000 UTC m=+145.997978679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.705651 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:03 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:03 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:03 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.705708 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.729963 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-285fj" podStartSLOduration=5.729944192 podStartE2EDuration="5.729944192s" podCreationTimestamp="2025-10-07 13:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.729945482 +0000 UTC m=+145.557871274" watchObservedRunningTime="2025-10-07 13:56:03.729944192 +0000 UTC m=+145.557869984" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.771218 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.771595 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.271580498 +0000 UTC m=+146.099506290 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.823871 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" podStartSLOduration=123.823853909 podStartE2EDuration="2m3.823853909s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.822860052 +0000 UTC m=+145.650785844" watchObservedRunningTime="2025-10-07 13:56:03.823853909 +0000 UTC m=+145.651779701" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.829359 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.861712 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" podStartSLOduration=124.861690193 podStartE2EDuration="2m4.861690193s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.854646564 +0000 UTC m=+145.682572356" watchObservedRunningTime="2025-10-07 13:56:03.861690193 +0000 UTC m=+145.689615985" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.872404 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.872679 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.372668227 +0000 UTC m=+146.200594019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.899603 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lclzf" podStartSLOduration=123.899585438 podStartE2EDuration="2m3.899585438s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.896344481 +0000 UTC m=+145.724270273" watchObservedRunningTime="2025-10-07 13:56:03.899585438 +0000 UTC m=+145.727511230" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.934311 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8cbq8" podStartSLOduration=123.934292518 podStartE2EDuration="2m3.934292518s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.932824339 +0000 UTC m=+145.760750131" watchObservedRunningTime="2025-10-07 13:56:03.934292518 +0000 UTC m=+145.762218310" Oct 07 13:56:03 crc kubenswrapper[4717]: I1007 13:56:03.972981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:03 crc kubenswrapper[4717]: E1007 13:56:03.973312 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.473298024 +0000 UTC m=+146.301223816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.002214 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" podStartSLOduration=124.002199528 podStartE2EDuration="2m4.002199528s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:03.999361042 +0000 UTC m=+145.827286834" watchObservedRunningTime="2025-10-07 13:56:04.002199528 +0000 UTC m=+145.830125320" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.027871 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" podStartSLOduration=124.027854716 podStartE2EDuration="2m4.027854716s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.025351419 +0000 UTC m=+145.853277221" watchObservedRunningTime="2025-10-07 13:56:04.027854716 +0000 UTC m=+145.855780508" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.074849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.075208 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.575192734 +0000 UTC m=+146.403118516 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.118307 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cs9pq" podStartSLOduration=125.118291509 podStartE2EDuration="2m5.118291509s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.105216369 +0000 UTC m=+145.933142161" watchObservedRunningTime="2025-10-07 13:56:04.118291509 +0000 UTC m=+145.946217301" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.176532 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.176664 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.676642213 +0000 UTC m=+146.504568015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.177029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.177448 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.677438275 +0000 UTC m=+146.505364067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.196775 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" podStartSLOduration=125.196756592 podStartE2EDuration="2m5.196756592s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.145314864 +0000 UTC m=+145.973240666" watchObservedRunningTime="2025-10-07 13:56:04.196756592 +0000 UTC m=+146.024682384" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.281603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.281934 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.781910584 +0000 UTC m=+146.609836376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.386601 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.387063 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.887043862 +0000 UTC m=+146.714969754 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.466757 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-497gd" event={"ID":"cf2c0436-e70a-4f8f-99a8-2cb9fbfe7a6a","Type":"ContainerStarted","Data":"60342e0ee19dff3f0537c75cf0030aef1b1cbf6b5515b908138c2d4a6b4da775"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.474172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" event={"ID":"e0e68626-bdd9-4340-8d96-64d03109427a","Type":"ContainerStarted","Data":"e6b8bf30f6aa09531fa6b1918f45f9b7f93bddeac9378af3e4f25ce6d466f072"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.488999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.489537 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:04.989516138 +0000 UTC m=+146.817441930 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.492406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-285fj" event={"ID":"758eea7d-7bcc-4b1e-8690-485cd73090bc","Type":"ContainerStarted","Data":"a4bd6d3ebc8495a2eb1e0e8302b239f442793c9e02be2353a3b8638dd3ed5eb5"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.507488 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mvs2v" podStartSLOduration=124.507474529 podStartE2EDuration="2m4.507474529s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.506406361 +0000 UTC m=+146.334332163" watchObservedRunningTime="2025-10-07 13:56:04.507474529 +0000 UTC m=+146.335400311" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.507815 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" event={"ID":"a923ae7c-d6cd-4d73-a547-27ef6581da72","Type":"ContainerStarted","Data":"23bff8fc0a3bcbcbd1f2765f87dc7abc99605f460914622cf97fb23b54b44436"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.507865 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" event={"ID":"a923ae7c-d6cd-4d73-a547-27ef6581da72","Type":"ContainerStarted","Data":"035dff2cc13be32dd1a66c5a4e0dcbaceba052f34cbf9f8363dd5b349c65e1c8"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.517883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" event={"ID":"10f378f0-4c92-4b53-9454-f9d83f3cb058","Type":"ContainerStarted","Data":"b4eeaa121942b792b71a60871548d93b028f7e0af7fab4d6af72706aaf2d6da9"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.521177 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" event={"ID":"1281b203-c580-4ebb-8c75-d6c8bafb3ca2","Type":"ContainerStarted","Data":"af55360ba3e38799ba15c8bc160b0db06419043ac145b041c6c442391f09b983"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.524718 4717 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-t9hbf container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.524783 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.17:8080/healthz\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 
07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.556096 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mt9vs" podStartSLOduration=124.556078012 podStartE2EDuration="2m4.556078012s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.554659144 +0000 UTC m=+146.382584936" watchObservedRunningTime="2025-10-07 13:56:04.556078012 +0000 UTC m=+146.384003804" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.561175 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" event={"ID":"78d9c154-e844-48ff-80e3-89d1d17f1343","Type":"ContainerStarted","Data":"77e35448c5b7f835e12d53336c60a7883fc097b8401496e2c618d9ead0b340d4"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.591793 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.592936 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.092925839 +0000 UTC m=+146.920851631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.598904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gpv6z" event={"ID":"92eb792a-3195-446f-9b72-6e78a48f7b9e","Type":"ContainerStarted","Data":"5a8e75a6ef690d6c3db0ae012c26d4ea288a2b646ecdefb9d4f4ac4e263e03db"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.598945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gpv6z" event={"ID":"92eb792a-3195-446f-9b72-6e78a48f7b9e","Type":"ContainerStarted","Data":"134f9bf20089613fb15e5ed47d0bd74ea44d046e2d2d6343046f5d0a681c2c73"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.598954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gpv6z" event={"ID":"92eb792a-3195-446f-9b72-6e78a48f7b9e","Type":"ContainerStarted","Data":"c25c740dd02c9fc9f86501bcff702168ef9cc80623515169f7c9ac4ca2bda430"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.599485 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.615789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-cr6vm" 
event={"ID":"a24414df-b777-4ee7-8d31-41e802737e89","Type":"ContainerStarted","Data":"6d48cb088fa0517f002e9d4d14ff5d355bd0fbe49cf3090a81c5a9e5b12c308f"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.620429 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d79w8" podStartSLOduration=125.620414016 podStartE2EDuration="2m5.620414016s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.583572669 +0000 UTC m=+146.411498481" watchObservedRunningTime="2025-10-07 13:56:04.620414016 +0000 UTC m=+146.448339798" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.643686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" event={"ID":"df090bea-623b-4671-b682-e0880460ef04","Type":"ContainerStarted","Data":"70a39836ef819ed47af916cab5a9d72058c7fe663cb815fe696c7d16294d16ab"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.644624 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.645822 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" podStartSLOduration=124.645812927 podStartE2EDuration="2m4.645812927s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.619788799 +0000 UTC m=+146.447714591" watchObservedRunningTime="2025-10-07 13:56:04.645812927 +0000 UTC m=+146.473738719" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.646679 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gpv6z" podStartSLOduration=6.64667453 podStartE2EDuration="6.64667453s" podCreationTimestamp="2025-10-07 13:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.644793649 +0000 UTC m=+146.472719441" watchObservedRunningTime="2025-10-07 13:56:04.64667453 +0000 UTC m=+146.474600322" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.656699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" event={"ID":"eee5a1d3-cec9-45fb-91ff-77d20bae5c80","Type":"ContainerStarted","Data":"920f93a67e1001ef395dda48dac760faf7fe69d208d0c4d8aaec96b2bb87179e"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.669121 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k9vz5" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.672509 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" podStartSLOduration=125.672489612 podStartE2EDuration="2m5.672489612s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.666406179 +0000 UTC m=+146.494331971" 
watchObservedRunningTime="2025-10-07 13:56:04.672489612 +0000 UTC m=+146.500415404" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.686099 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" event={"ID":"df8e1457-a676-4aeb-ad71-6162115e8b58","Type":"ContainerStarted","Data":"746db4c0044ec06627f4c372f74967df7728bc978aedf4c50b5e86959ddbeb56"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.693455 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.694565 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.194544883 +0000 UTC m=+147.022470675 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.705423 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:04 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:04 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:04 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.705504 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.732315 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-h8764" podStartSLOduration=125.732288484 podStartE2EDuration="2m5.732288484s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.724444664 +0000 UTC m=+146.552370466" watchObservedRunningTime="2025-10-07 13:56:04.732288484 +0000 UTC m=+146.560214286" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.744603 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" event={"ID":"c3436d60-53c5-4994-88f1-d3aa555e96bd","Type":"ContainerStarted","Data":"bd16c26616b90daaf073d4034a191f3e0b022278bb83279c974b6748b27d3bae"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.750896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" event={"ID":"92e81bbc-e8d4-414d-8e09-d8bff498f83f","Type":"ContainerStarted","Data":"f2ab5ef0b6aff82168a1c94e73a547e2fb09ba1c71efcfcc83ff76a941550f39"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.753912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2r2zr" event={"ID":"757f147e-b766-4200-9a8e-c4fbad44fed0","Type":"ContainerStarted","Data":"4ef5be7ce1795e42a84733ebee29099b5df1745ec40f233b36ccdbc4d20bc3b6"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.753981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2r2zr" event={"ID":"757f147e-b766-4200-9a8e-c4fbad44fed0","Type":"ContainerStarted","Data":"0acfcd60ded6d008b6107e26ce83df4c3d05491ff1148dd194c0e8ad91ee875f"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.754321 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.755636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" event={"ID":"37c53362-c687-4727-97b9-95a9a358cf5b","Type":"ContainerStarted","Data":"8b01b86cce3843dc7fc1f14264207948424bd64f2051d7f85e0867b3cd4fed82"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.755658 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" event={"ID":"37c53362-c687-4727-97b9-95a9a358cf5b","Type":"ContainerStarted","Data":"4af5c2b8411603202d7e90871156a82dea6de2d3e3281f627e73dc8f09fda1d1"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.756991 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-2r2zr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.757034 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2r2zr" podUID="757f147e-b766-4200-9a8e-c4fbad44fed0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.758574 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" event={"ID":"f24295a8-cbdd-48d0-b48a-b73e86e53fc9","Type":"ContainerStarted","Data":"1a2eaa656f3b0c1b771212d5a91f69bb416c23b7e508577c435e2d0c796d094a"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.759147 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.768437 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" event={"ID":"794eee5b-ff81-40a4-8e8b-8e97d651d7bc","Type":"ContainerStarted","Data":"1489efdcb5acace7aa5b19112839ada3042fbe613360c13d5f9351e954c3d4d0"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.768471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" 
event={"ID":"794eee5b-ff81-40a4-8e8b-8e97d651d7bc","Type":"ContainerStarted","Data":"cafb87c7a95e1540e2244e7871cd5fe32f6617b253648bfff8f5288316b631b0"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.781844 4717 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-prnk6 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.781901 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" podUID="f24295a8-cbdd-48d0-b48a-b73e86e53fc9" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.791455 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" event={"ID":"d3fdb570-c6b1-43a3-9726-b3870e21c38d","Type":"ContainerStarted","Data":"84980f043e1404437f89dfe007d91a8277f1998d2ec2566f49dc63b71a7e2f7c"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.798284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.800392 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.300377569 +0000 UTC m=+147.128303361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.803683 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" event={"ID":"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd","Type":"ContainerStarted","Data":"3f33226875b869bff542b891db823c40db19afe7b15b12f934082897c56a1526"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.803715 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" event={"ID":"2a7d8fd9-1ef3-407e-b3bf-1f02d7ea1cfd","Type":"ContainerStarted","Data":"4137b630b0e9f9328af5e54271cbf4392b51132cd23616d83bcb9320485642ec"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.805834 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" event={"ID":"3ad55ef1-703c-423f-9ee5-254322fd0cb2","Type":"ContainerStarted","Data":"099c6044d0eeb5c2485edd3f35cf1b6cb3110c41e03763d2468b74b29174e709"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.805855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" event={"ID":"3ad55ef1-703c-423f-9ee5-254322fd0cb2","Type":"ContainerStarted","Data":"8b7b6c5e74d96c69a24aa4aea5edfcab3717df5382a8e1b4dc0a5d533ff53a88"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.807221 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" event={"ID":"8f85d8bb-3f65-40b1-bfd3-f878f44638ce","Type":"ContainerStarted","Data":"ba97a9835b24bfe55401da6ef4cb780bdd8d0b5b5fb2c05b38f5666b6f1ffc73"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.808271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" event={"ID":"bea65594-0f14-4f94-86c9-6f3ae50daf32","Type":"ContainerStarted","Data":"9d1fa11ba3f05d82e6970612b5c826eb813f40ce6d42531621aebc9b86e5a646"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.813433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8ftz" event={"ID":"93b2f0cf-f03b-48f9-9b04-036c07f435b9","Type":"ContainerStarted","Data":"a1bac17071ab220e8d03111e67e717fcd2deb078ca863fdd0dbc60784e11ab9a"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.813469 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b8ftz" event={"ID":"93b2f0cf-f03b-48f9-9b04-036c07f435b9","Type":"ContainerStarted","Data":"bd84633510a491fcd4b32a3f97957a47f8b5cb91dd0fd31a35586626c0514362"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.823343 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-79gpc" podStartSLOduration=124.823328074 podStartE2EDuration="2m4.823328074s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.780122396 +0000 UTC m=+146.608048188" watchObservedRunningTime="2025-10-07 13:56:04.823328074 +0000 UTC m=+146.651253866" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.833356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" event={"ID":"9f5c6468-8f25-4833-a33e-ffacc52cbaff","Type":"ContainerStarted","Data":"e18ef5a0e9b4e979e242a3a56c7ebdbad383cd8ad16d642fa2bbeb340be4b0d3"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.856251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" event={"ID":"51567150-6df0-4b38-aec7-d9e78df693dc","Type":"ContainerStarted","Data":"5f1534526986777660e4c05824a61126c0f1c0b35ab2f34d9028d04877f5b3f9"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.856319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" event={"ID":"51567150-6df0-4b38-aec7-d9e78df693dc","Type":"ContainerStarted","Data":"df87e9fdcafd411b7269d3721b610343e7c9374ba21bbbbd3bd6f6ca2245d35a"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.866545 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" podStartSLOduration=124.866529142 podStartE2EDuration="2m4.866529142s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.86533881 +0000 UTC m=+146.693264602" watchObservedRunningTime="2025-10-07 13:56:04.866529142 +0000 UTC m=+146.694454934" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.867451 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" podStartSLOduration=125.867444036 podStartE2EDuration="2m5.867444036s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.834120703 +0000 UTC m=+146.662046485" watchObservedRunningTime="2025-10-07 13:56:04.867444036 +0000 UTC m=+146.695369838" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.892776 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" event={"ID":"71387a0e-c68c-4cda-bc81-9c78fab38a52","Type":"ContainerStarted","Data":"9be9ee1b73c0fae94b14562771a5a6783f865ffacbe871e55089b009f93164e3"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.893077 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" event={"ID":"71387a0e-c68c-4cda-bc81-9c78fab38a52","Type":"ContainerStarted","Data":"57277d2f7236ef0daad1b6a4472064547ef97e6f4bab41ee13c73db6683a7426"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.897729 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2r2zr" podStartSLOduration=125.897713447 podStartE2EDuration="2m5.897713447s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-07 13:56:04.89743852 +0000 UTC m=+146.725364312" watchObservedRunningTime="2025-10-07 13:56:04.897713447 +0000 UTC m=+146.725639239" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.900516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:04 crc kubenswrapper[4717]: E1007 13:56:04.900899 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.400878122 +0000 UTC m=+147.228803914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.908556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" event={"ID":"67e983f2-2f6b-40c2-9690-bf0199c02f04","Type":"ContainerStarted","Data":"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf"} Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.919021 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.965714 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fbrgn" podStartSLOduration=125.965696659 podStartE2EDuration="2m5.965696659s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.963748277 +0000 UTC m=+146.791674089" watchObservedRunningTime="2025-10-07 13:56:04.965696659 +0000 UTC m=+146.793622451" Oct 07 13:56:04 crc kubenswrapper[4717]: I1007 13:56:04.965809 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7b25n" podStartSLOduration=124.965805172 podStartE2EDuration="2m4.965805172s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:04.929582872 +0000 UTC m=+146.757508664" watchObservedRunningTime="2025-10-07 13:56:04.965805172 +0000 UTC m=+146.793730964" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.019073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.031275 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.531253186 +0000 UTC m=+147.359178978 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.069303 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" podStartSLOduration=125.069278175 podStartE2EDuration="2m5.069278175s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.055519517 +0000 UTC m=+146.883445309" watchObservedRunningTime="2025-10-07 13:56:05.069278175 +0000 UTC m=+146.897203967" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.095462 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j6lnh" podStartSLOduration=126.095445837 podStartE2EDuration="2m6.095445837s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.094783449 +0000 UTC m=+146.922709241" watchObservedRunningTime="2025-10-07 13:56:05.095445837 +0000 UTC m=+146.923371629" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.119676 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.119971 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.619952583 +0000 UTC m=+147.447878375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.140697 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-tfzld" podStartSLOduration=125.140678489 podStartE2EDuration="2m5.140678489s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.140329799 +0000 UTC m=+146.968255591" watchObservedRunningTime="2025-10-07 13:56:05.140678489 +0000 UTC m=+146.968604281" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.162379 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b8ftz" podStartSLOduration=7.16236259 podStartE2EDuration="7.16236259s" podCreationTimestamp="2025-10-07 13:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.161567129 +0000 UTC m=+146.989492921" watchObservedRunningTime="2025-10-07 13:56:05.16236259 +0000 UTC m=+146.990288382" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.194834 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v2qhz" podStartSLOduration=126.194605404 podStartE2EDuration="2m6.194605404s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.192836797 +0000 UTC m=+147.020762599" watchObservedRunningTime="2025-10-07 13:56:05.194605404 +0000 UTC m=+147.022531196" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.221367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.221752 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.721725281 +0000 UTC m=+147.549651073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.246661 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-njglk" podStartSLOduration=125.246645039 podStartE2EDuration="2m5.246645039s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.22055807 +0000 UTC m=+147.048483882" watchObservedRunningTime="2025-10-07 13:56:05.246645039 +0000 UTC m=+147.074570831" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.246792 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-sgd7c" podStartSLOduration=125.246788983 podStartE2EDuration="2m5.246788983s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.246285229 +0000 UTC m=+147.074211021" watchObservedRunningTime="2025-10-07 13:56:05.246788983 +0000 UTC m=+147.074714775" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.276649 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nwg5z" podStartSLOduration=125.276630332 podStartE2EDuration="2m5.276630332s" podCreationTimestamp="2025-10-07 13:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.276092258 +0000 UTC m=+147.104018060" watchObservedRunningTime="2025-10-07 13:56:05.276630332 +0000 UTC m=+147.104556124" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.322141 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.322504 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.822489591 +0000 UTC m=+147.650415383 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.339123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.339539 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.353309 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.424318 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.424927 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:05.924911386 +0000 UTC m=+147.752837178 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.525336 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.525427 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.025408309 +0000 UTC m=+147.853334101 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.525609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.526097 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.026086498 +0000 UTC m=+147.854012280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.626707 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.626853 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.126833966 +0000 UTC m=+147.954759748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.626977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.627338 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.12731896 +0000 UTC m=+147.955244802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.702370 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:05 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:05 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:05 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.702678 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.728235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.728353 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.228332177 +0000 UTC m=+148.056257969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.728432 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.728741 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.228732467 +0000 UTC m=+148.056658259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.829464 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.829835 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.329784775 +0000 UTC m=+148.157710567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.913864 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-98nvq" event={"ID":"3ad55ef1-703c-423f-9ee5-254322fd0cb2","Type":"ContainerStarted","Data":"a6cc36d84523eff1ce2d9d58dac112531d910811cfd87c445e20c45eace12c43"} Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.915689 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" event={"ID":"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad","Type":"ContainerStarted","Data":"b08a443d2ddc8e36b892146b6dbe15352fd1409fab68194fe118512de6bfe415"} Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.915727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" event={"ID":"e828154d-3ba2-4fb8-8f56-b8b73dd4c9ad","Type":"ContainerStarted","Data":"6c7bce4b29d34f8a05ddb2e25b24acf190c12145aa4a4b91691fd424f5fb801e"} Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.916916 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" event={"ID":"bea65594-0f14-4f94-86c9-6f3ae50daf32","Type":"ContainerStarted","Data":"46e74adbedc840a4b2be8386f25571ff6a8d5dc4d520a0b54abd1cb038d62fc1"} Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.918146 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-2r2zr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.918180 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2r2zr" podUID="757f147e-b766-4200-9a8e-c4fbad44fed0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.938537 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.939417 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.939824 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-qb2bx" Oct 07 13:56:05 crc kubenswrapper[4717]: E1007 13:56:05.940044 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 13:56:06.44003192 +0000 UTC m=+148.267957712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.964495 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-prnk6" Oct 07 13:56:05 crc kubenswrapper[4717]: I1007 13:56:05.986595 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" podStartSLOduration=126.986574857 podStartE2EDuration="2m6.986574857s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:05.955270018 +0000 UTC m=+147.783195810" watchObservedRunningTime="2025-10-07 13:56:05.986574857 +0000 UTC m=+147.814500649" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.046640 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.058285 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.558257028 +0000 UTC m=+148.386182820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.068095 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.072654 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.079320 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.088505 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.160654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.160705 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljwq\" (UniqueName: \"kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.160752 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.160793 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.161104 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.661091324 +0000 UTC m=+148.489017116 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.228212 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.229178 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.235337 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.249148 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.263533 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.263876 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.263935 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.263970 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2sn6\" (UniqueName: \"kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.264031 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.264054 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.264078 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nljwq\" (UniqueName: \"kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.264405 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-10-07 13:56:06.764389263 +0000 UTC m=+148.592315055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.264747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.264871 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.318139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljwq\" (UniqueName: \"kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq\") pod \"community-operators-gmhlf\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.365686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.366707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2sn6\" (UniqueName: \"kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.366811 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.366945 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.367331 4717 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.867317141 +0000 UTC m=+148.695242933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.367885 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.367986 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.398617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2sn6\" (UniqueName: \"kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6\") pod \"certified-operators-xxhfh\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.412134 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.417111 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.418025 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.467831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.468398 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.468502 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.468529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjhq\" (UniqueName: \"kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.468619 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:06.968604406 +0000 UTC m=+148.796530198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.524077 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.571792 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.571855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.571877 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjhq\" (UniqueName: \"kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.571924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.572310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.572548 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.072537401 +0000 UTC m=+148.900463193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.572854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.577134 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.615392 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.619428 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.632491 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjhq\" (UniqueName: \"kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq\") pod \"community-operators-bsfw6\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.635183 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.672547 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.673354 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.673469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfmfg\" (UniqueName: \"kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.673599 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " 
pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.673802 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.173781714 +0000 UTC m=+149.001707506 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.707485 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:06 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:06 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:06 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.707548 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.778321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.778386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.778405 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfmfg\" (UniqueName: \"kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.778420 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.779083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.779310 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.279299002 +0000 UTC m=+149.107224794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.779517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.782207 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.826056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfmfg\" (UniqueName: \"kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg\") pod \"certified-operators-tlzcg\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.879733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.879870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.879919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.879972 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.882995 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.38297011 +0000 UTC m=+149.210895902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.889592 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.890583 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.891382 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.951337 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.966900 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" event={"ID":"bea65594-0f14-4f94-86c9-6f3ae50daf32","Type":"ContainerStarted","Data":"572af5b9d70179a388dbf3673d34110dc553672bd7616aca0ee7394e1c55b981"} Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.966934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" event={"ID":"bea65594-0f14-4f94-86c9-6f3ae50daf32","Type":"ContainerStarted","Data":"897e8639d58548fb4811b9155e5ac899c0f73da41c8cf09b95f400ee4eed2c08"} Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.990000 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.990081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:06 crc kubenswrapper[4717]: E1007 13:56:06.990380 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.490364738 +0000 UTC m=+149.318290530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.991877 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-2r2zr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 07 13:56:06 crc kubenswrapper[4717]: I1007 13:56:06.991915 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2r2zr" podUID="757f147e-b766-4200-9a8e-c4fbad44fed0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.003925 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.089167 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.091641 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.096732 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.596711478 +0000 UTC m=+149.424637270 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.105230 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.123267 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.193772 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.194118 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.694105899 +0000 UTC m=+149.522031681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.205837 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.296628 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.296900 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:07.796885313 +0000 UTC m=+149.624811105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.397870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.398586 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-07 13:56:07.898571328 +0000 UTC m=+149.726497120 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.404478 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.485250 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.500356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.500781 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.000705215 +0000 UTC m=+149.828631007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.577528 4717 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.603483 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.604216 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.104201499 +0000 UTC m=+149.932127291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.705641 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.705950 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.205926025 +0000 UTC m=+150.033851817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.709144 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.714709 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:07 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:07 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:07 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.714779 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.807985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.808373 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.30836144 +0000 UTC m=+150.136287232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: W1007 13:56:07.826173 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-88f74bd436d8869f7f7d61443dfaa485eb8f2f13dbc2636230a27002a256d8a1 WatchSource:0}: Error finding container 88f74bd436d8869f7f7d61443dfaa485eb8f2f13dbc2636230a27002a256d8a1: Status 404 returned error can't find the container with id 88f74bd436d8869f7f7d61443dfaa485eb8f2f13dbc2636230a27002a256d8a1 Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.910968 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.911221 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.411202436 +0000 UTC m=+150.239128228 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.911280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:07 crc kubenswrapper[4717]: E1007 13:56:07.911574 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.411567626 +0000 UTC m=+150.239493408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncngn" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.985516 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"88f74bd436d8869f7f7d61443dfaa485eb8f2f13dbc2636230a27002a256d8a1"} Oct 07 13:56:07 crc kubenswrapper[4717]: I1007 13:56:07.994251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerStarted","Data":"9b877c1ec0ef1de97f2f0ef61e82e427abd798c8ae24d3f54b73e85e3e81c1c3"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.007645 4717 generic.go:334] "Generic (PLEG): container finished" podID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerID="92d51aab409ddba4e138dde39686d8197bb79c6df675753558048fc0109cfc7c" exitCode=0 Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.007750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerDied","Data":"92d51aab409ddba4e138dde39686d8197bb79c6df675753558048fc0109cfc7c"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.007775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerStarted","Data":"ee8c3349c8956d5c2962c225aec9e14446a8c925096924a92c4f4a4dbd620b20"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.012482 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:08 crc kubenswrapper[4717]: E1007 13:56:08.012879 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-07 13:56:08.512862941 +0000 UTC m=+150.340788733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.022889 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.035073 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" event={"ID":"bea65594-0f14-4f94-86c9-6f3ae50daf32","Type":"ContainerStarted","Data":"f1295fd1dd7a02a9ad9e4f6b278285a125540a4702aa682d4f2bc219baad7ef6"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.037752 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerID="3a9f49de65c6f063b3136e0e5b2314a4eb9e7389275f7d7cd5f388136bb07f67" exitCode=0 Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.037823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerDied","Data":"3a9f49de65c6f063b3136e0e5b2314a4eb9e7389275f7d7cd5f388136bb07f67"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.037839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerStarted","Data":"2a828b1a1d7b6c08f7fce9a52ec3497a478a36a896dbdad751a207903e0a8048"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.048186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerStarted","Data":"08bee7e9be5834c9b7a875ab58291d9262aa7392c37a0adb8158e2363b32ad90"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.048238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerStarted","Data":"05b0af3b86a5255a59f51f2b28f695f3ab288e7049f1c87260541cb5223426f8"} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.062769 4717 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-07T13:56:07.577558215Z","Handler":null,"Name":""} Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.076201 4717 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.076246 4717 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.096779 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jjtwt" podStartSLOduration=10.096745249 podStartE2EDuration="10.096745249s" 
podCreationTimestamp="2025-10-07 13:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:08.077887733 +0000 UTC m=+149.905813525" watchObservedRunningTime="2025-10-07 13:56:08.096745249 +0000 UTC m=+149.924671041" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.134767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.152160 4717 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.152203 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.207271 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.208496 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.210644 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.223888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.240865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncngn\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.337223 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.343392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.343815 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.343934 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzb9z\" (UniqueName: \"kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.344085 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.400497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.444831 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzb9z\" (UniqueName: \"kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.444914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.444983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.445531 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.445663 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.477546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzb9z\" (UniqueName: \"kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z\") pod \"redhat-marketplace-rrhrm\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.562959 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.620316 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.622075 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.625745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.688790 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.690741 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.695609 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.695828 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.703751 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:08 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:08 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:08 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.703792 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.708991 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.714819 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:56:08 crc kubenswrapper[4717]: W1007 13:56:08.720343 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3c6c7fc_eddf_402e_b2ce_96ae0cf04642.slice/crio-8127077191be677d3357eda3f1db046d124cc93cf656bea190c3ea6d4c7d19f8 WatchSource:0}: Error finding container 8127077191be677d3357eda3f1db046d124cc93cf656bea190c3ea6d4c7d19f8: Status 404 returned error can't find the container with id 8127077191be677d3357eda3f1db046d124cc93cf656bea190c3ea6d4c7d19f8 Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.750618 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.750690 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.750721 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nt7s\" (UniqueName: \"kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.806238 4717 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:56:08 crc kubenswrapper[4717]: W1007 13:56:08.816239 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9208ec_98a7_48f1_a52e_b7b188e35aa5.slice/crio-7dd6738d4523a135d5a4cf3189be80192ae0d6990e8061d4320b04f405550079 WatchSource:0}: Error finding container 7dd6738d4523a135d5a4cf3189be80192ae0d6990e8061d4320b04f405550079: Status 404 returned error can't find the container with id 7dd6738d4523a135d5a4cf3189be80192ae0d6990e8061d4320b04f405550079 Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.851983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852065 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852111 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nt7s\" (UniqueName: \"kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852560 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.852580 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.874866 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nt7s\" 
(UniqueName: \"kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s\") pod \"redhat-marketplace-xq2zf\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.883212 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.945928 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.953891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.953992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.954407 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:08 crc kubenswrapper[4717]: I1007 13:56:08.984669 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.011915 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.067663 4717 generic.go:334] "Generic (PLEG): container finished" podID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerID="08bee7e9be5834c9b7a875ab58291d9262aa7392c37a0adb8158e2363b32ad90" exitCode=0 Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.067724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerDied","Data":"08bee7e9be5834c9b7a875ab58291d9262aa7392c37a0adb8158e2363b32ad90"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.073636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2a3cbffcd1dd4f2a94ac20f95a9b2fefc864c8bb7c04c0d2f7c7f4cb5a4e12b5"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.073665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"da3688b968d343702a8bb1141c864b9e9070d8122e3d5aa0f255d88575632ec2"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.074415 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.077297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e7fb409a0ba485da6fc600a203ea4e6c183e63085681d1ce8bfe8af9a2717a98"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.079494 4717 generic.go:334] "Generic (PLEG): container finished" podID="d08a38f6-5d00-440e-9023-5a586d206de3" containerID="ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec" exitCode=0 Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.079722 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerDied","Data":"ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.082166 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerID="b31c229f9db5266e57340081e676b0e0dd92d4d8b9a1fa051f95a1eb40121225" exitCode=0 Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.082344 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerDied","Data":"b31c229f9db5266e57340081e676b0e0dd92d4d8b9a1fa051f95a1eb40121225"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.082389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerStarted","Data":"7dd6738d4523a135d5a4cf3189be80192ae0d6990e8061d4320b04f405550079"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.086034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8816a3bba7f31603c9a04057a5fc13421090adb92fc68044a5470afaf7a30171"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.086084 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c814c4dc1347892b19999fc3c9e9eb69165171432286c5d42c32b4a3630d49d3"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.094082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" event={"ID":"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642","Type":"ContainerStarted","Data":"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.094121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" event={"ID":"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642","Type":"ContainerStarted","Data":"8127077191be677d3357eda3f1db046d124cc93cf656bea190c3ea6d4c7d19f8"} Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.094140 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.241704 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.243294 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.254640 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.318490 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.345118 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" podStartSLOduration=130.341000913 podStartE2EDuration="2m10.341000913s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:09.322868427 +0000 UTC m=+151.150794239" watchObservedRunningTime="2025-10-07 13:56:09.341000913 +0000 UTC m=+151.168926705" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.360413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.360476 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz6j\" (UniqueName: \"kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.360636 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.366166 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.463087 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.463222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.463274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llz6j\" (UniqueName: \"kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.464823 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.465296 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.484837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz6j\" (UniqueName: \"kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j\") pod \"redhat-operators-p7c4n\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.508880 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 07 13:56:09 crc kubenswrapper[4717]: W1007 13:56:09.527646 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd6188724_8aba_4097_bba3_be0a6c239263.slice/crio-d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855 WatchSource:0}: Error finding container d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855: Status 404 returned error can't find the container with id d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855 Oct 07 13:56:09 crc 
kubenswrapper[4717]: I1007 13:56:09.605226 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.609743 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.622510 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.635501 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.714432 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:09 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:09 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:09 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.714490 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.770645 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6754k\" (UniqueName: \"kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.770929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.771069 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.783605 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b55vq" Oct 07 13:56:09 crc kubenswrapper[4717]: E1007 13:56:09.807448 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd94dd61_61e7_473c_baf4_60213f5cd072.slice/crio-conmon-245da47b479c2306a320ef7e82841b8f9afe690e6606254998daee54adc7d0bb.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd94dd61_61e7_473c_baf4_60213f5cd072.slice/crio-245da47b479c2306a320ef7e82841b8f9afe690e6606254998daee54adc7d0bb.scope\": RecentStats: unable to find data in memory cache]" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.872698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.872800 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.873044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6754k\" (UniqueName: \"kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.873933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.874154 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:09 crc kubenswrapper[4717]: I1007 13:56:09.902919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6754k\" (UniqueName: \"kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k\") pod \"redhat-operators-8gqgw\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.017472 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.090504 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.196088 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6188724-8aba-4097-bba3-be0a6c239263","Type":"ContainerStarted","Data":"d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855"} Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.199234 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerID="245da47b479c2306a320ef7e82841b8f9afe690e6606254998daee54adc7d0bb" exitCode=0 Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.200799 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerDied","Data":"245da47b479c2306a320ef7e82841b8f9afe690e6606254998daee54adc7d0bb"} Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.200823 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerStarted","Data":"5be3b509387a30095ba8e37c3247d3267bc9499c75dad07acad5a9a1e062a8e9"} Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.336526 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:10 crc kubenswrapper[4717]: W1007 13:56:10.354720 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod144e34a6_69de_495d_afc2_96b1580f599f.slice/crio-3f4059f035bf2ec83183150667443611dd76ab50cf9d3bc8a81163d4480d0e51 WatchSource:0}: Error finding container 3f4059f035bf2ec83183150667443611dd76ab50cf9d3bc8a81163d4480d0e51: Status 404 returned error can't find the container with id 3f4059f035bf2ec83183150667443611dd76ab50cf9d3bc8a81163d4480d0e51 Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.438085 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.438123 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.442031 4717 patch_prober.go:28] interesting pod/console-f9d7485db-k6dvg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.442071 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k6dvg" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.549385 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.550127 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.562293 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.698682 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.701724 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:10 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:10 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:10 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:10 crc kubenswrapper[4717]: I1007 13:56:10.702139 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.210641 4717 generic.go:334] "Generic (PLEG): container finished" podID="d6188724-8aba-4097-bba3-be0a6c239263" containerID="421b10356c1c39ab9c87486d81a090d48f55fa0122a34dda07643c8cbc6db0a3" exitCode=0 Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.210760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6188724-8aba-4097-bba3-be0a6c239263","Type":"ContainerDied","Data":"421b10356c1c39ab9c87486d81a090d48f55fa0122a34dda07643c8cbc6db0a3"} Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.220058 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerID="852d82f9d7c6abafb0cfa3da43f18d8c8f17cfe3662094fe89c0e4979298cad0" exitCode=0 Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.220135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerDied","Data":"852d82f9d7c6abafb0cfa3da43f18d8c8f17cfe3662094fe89c0e4979298cad0"} Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.220163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerStarted","Data":"c5f7b37a11461bd67f1bd90de330c12738b9b6da8da611a979609f5968acb062"} Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.225496 4717 generic.go:334] "Generic (PLEG): container finished" podID="144e34a6-69de-495d-afc2-96b1580f599f" containerID="8ad789fbb4f38aa935b756ff6624d1cd7395c70d31251bbc495d77664b47def7" exitCode=0 Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.227223 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerDied","Data":"8ad789fbb4f38aa935b756ff6624d1cd7395c70d31251bbc495d77664b47def7"} Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.227249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" 
event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerStarted","Data":"3f4059f035bf2ec83183150667443611dd76ab50cf9d3bc8a81163d4480d0e51"} Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.235094 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-q2jhk" Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.637134 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-2r2zr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.637193 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2r2zr" podUID="757f147e-b766-4200-9a8e-c4fbad44fed0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.637134 4717 patch_prober.go:28] interesting pod/downloads-7954f5f757-2r2zr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.637609 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2r2zr" podUID="757f147e-b766-4200-9a8e-c4fbad44fed0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.701160 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:11 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:11 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:11 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:11 crc kubenswrapper[4717]: I1007 13:56:11.701239 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.247072 4717 generic.go:334] "Generic (PLEG): container finished" podID="37c53362-c687-4727-97b9-95a9a358cf5b" containerID="8b01b86cce3843dc7fc1f14264207948424bd64f2051d7f85e0867b3cd4fed82" exitCode=0 Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.247161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" event={"ID":"37c53362-c687-4727-97b9-95a9a358cf5b","Type":"ContainerDied","Data":"8b01b86cce3843dc7fc1f14264207948424bd64f2051d7f85e0867b3cd4fed82"} Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.700194 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:12 crc kubenswrapper[4717]: [-]has-synced failed: reason 
withheld Oct 07 13:56:12 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:12 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.700559 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.726728 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.823617 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access\") pod \"d6188724-8aba-4097-bba3-be0a6c239263\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.823688 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir\") pod \"d6188724-8aba-4097-bba3-be0a6c239263\" (UID: \"d6188724-8aba-4097-bba3-be0a6c239263\") " Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.824047 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d6188724-8aba-4097-bba3-be0a6c239263" (UID: "d6188724-8aba-4097-bba3-be0a6c239263"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.836494 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d6188724-8aba-4097-bba3-be0a6c239263" (UID: "d6188724-8aba-4097-bba3-be0a6c239263"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.934910 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d6188724-8aba-4097-bba3-be0a6c239263-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:12 crc kubenswrapper[4717]: I1007 13:56:12.934955 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d6188724-8aba-4097-bba3-be0a6c239263-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.279963 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.280522 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d6188724-8aba-4097-bba3-be0a6c239263","Type":"ContainerDied","Data":"d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855"} Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.280545 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b42e26cb9acc7fbcaa8d2d3e85158ffb834b4d47aff0e4e3fc2792e78bb855" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.376845 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gpv6z" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.649055 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.700351 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:13 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:13 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:13 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.700439 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.748433 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume\") pod \"37c53362-c687-4727-97b9-95a9a358cf5b\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.748474 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume\") pod \"37c53362-c687-4727-97b9-95a9a358cf5b\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.748733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgp5\" (UniqueName: \"kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5\") pod \"37c53362-c687-4727-97b9-95a9a358cf5b\" (UID: \"37c53362-c687-4727-97b9-95a9a358cf5b\") " Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.749350 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "37c53362-c687-4727-97b9-95a9a358cf5b" (UID: "37c53362-c687-4727-97b9-95a9a358cf5b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.754817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5" (OuterVolumeSpecName: "kube-api-access-rkgp5") pod "37c53362-c687-4727-97b9-95a9a358cf5b" (UID: "37c53362-c687-4727-97b9-95a9a358cf5b"). InnerVolumeSpecName "kube-api-access-rkgp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.767078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "37c53362-c687-4727-97b9-95a9a358cf5b" (UID: "37c53362-c687-4727-97b9-95a9a358cf5b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.852587 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgp5\" (UniqueName: \"kubernetes.io/projected/37c53362-c687-4727-97b9-95a9a358cf5b-kube-api-access-rkgp5\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.852627 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37c53362-c687-4727-97b9-95a9a358cf5b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.852654 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37c53362-c687-4727-97b9-95a9a358cf5b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.942133 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:56:13 crc kubenswrapper[4717]: E1007 13:56:13.942454 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6188724-8aba-4097-bba3-be0a6c239263" containerName="pruner" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.942505 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6188724-8aba-4097-bba3-be0a6c239263" containerName="pruner" Oct 07 13:56:13 crc kubenswrapper[4717]: E1007 13:56:13.942524 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c53362-c687-4727-97b9-95a9a358cf5b" containerName="collect-profiles" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.942533 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c53362-c687-4727-97b9-95a9a358cf5b" containerName="collect-profiles" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.942701 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6188724-8aba-4097-bba3-be0a6c239263" containerName="pruner" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.942735 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c53362-c687-4727-97b9-95a9a358cf5b" containerName="collect-profiles" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.944134 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.947555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.947821 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 07 13:56:13 crc kubenswrapper[4717]: I1007 13:56:13.959493 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.054642 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.054697 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.155362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.155438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.155570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.176381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.264805 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.322642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" event={"ID":"37c53362-c687-4727-97b9-95a9a358cf5b","Type":"ContainerDied","Data":"4af5c2b8411603202d7e90871156a82dea6de2d3e3281f627e73dc8f09fda1d1"} Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.323050 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4af5c2b8411603202d7e90871156a82dea6de2d3e3281f627e73dc8f09fda1d1" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.323313 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq" Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.700283 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:14 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:14 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:14 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:14 crc kubenswrapper[4717]: I1007 13:56:14.700343 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:15 crc kubenswrapper[4717]: I1007 13:56:15.701776 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:15 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:15 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:15 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:15 crc kubenswrapper[4717]: I1007 13:56:15.702138 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:16 crc kubenswrapper[4717]: I1007 13:56:16.700303 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:16 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:16 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:16 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:16 crc kubenswrapper[4717]: I1007 13:56:16.700756 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:17 crc kubenswrapper[4717]: I1007 13:56:17.700586 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:17 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:17 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:17 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:17 crc kubenswrapper[4717]: I1007 13:56:17.700660 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:18 crc kubenswrapper[4717]: I1007 13:56:18.186841 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 13:56:18 crc kubenswrapper[4717]: I1007 13:56:18.701416 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:18 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:18 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:18 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:18 crc kubenswrapper[4717]: I1007 13:56:18.701472 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:19 crc kubenswrapper[4717]: I1007 13:56:19.700271 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:19 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:19 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:19 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:19 crc kubenswrapper[4717]: I1007 13:56:19.700391 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:20 crc kubenswrapper[4717]: I1007 13:56:20.438133 4717 patch_prober.go:28] interesting pod/console-f9d7485db-k6dvg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Oct 07 13:56:20 crc kubenswrapper[4717]: I1007 13:56:20.438440 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k6dvg" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Oct 07 13:56:20 crc kubenswrapper[4717]: I1007 13:56:20.701288 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:20 crc kubenswrapper[4717]: [-]has-synced failed: reason 
withheld Oct 07 13:56:20 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:20 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:20 crc kubenswrapper[4717]: I1007 13:56:20.701395 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:21 crc kubenswrapper[4717]: I1007 13:56:21.641972 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2r2zr" Oct 07 13:56:21 crc kubenswrapper[4717]: I1007 13:56:21.700594 4717 patch_prober.go:28] interesting pod/router-default-5444994796-2v924 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 07 13:56:21 crc kubenswrapper[4717]: [-]has-synced failed: reason withheld Oct 07 13:56:21 crc kubenswrapper[4717]: [+]process-running ok Oct 07 13:56:21 crc kubenswrapper[4717]: healthz check failed Oct 07 13:56:21 crc kubenswrapper[4717]: I1007 13:56:21.700653 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2v924" podUID="84d6de19-34a7-495c-be2e-71b777330cee" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 07 13:56:22 crc kubenswrapper[4717]: I1007 13:56:22.110883 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:56:22 crc kubenswrapper[4717]: I1007 13:56:22.116399 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/004bf989-60a1-4a45-bb4d-fc6a41829f3d-metrics-certs\") pod \"network-metrics-daemon-vl8rk\" (UID: \"004bf989-60a1-4a45-bb4d-fc6a41829f3d\") " pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:56:22 crc kubenswrapper[4717]: I1007 13:56:22.416445 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vl8rk" Oct 07 13:56:22 crc kubenswrapper[4717]: I1007 13:56:22.702267 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:22 crc kubenswrapper[4717]: I1007 13:56:22.704108 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2v924" Oct 07 13:56:28 crc kubenswrapper[4717]: I1007 13:56:28.342242 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 13:56:30 crc kubenswrapper[4717]: I1007 13:56:30.482476 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:30 crc kubenswrapper[4717]: I1007 13:56:30.486087 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 13:56:31 crc kubenswrapper[4717]: I1007 13:56:31.609859 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:56:31 crc kubenswrapper[4717]: I1007 13:56:31.609923 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:56:39 crc kubenswrapper[4717]: E1007 13:56:39.120280 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 13:56:39 crc kubenswrapper[4717]: E1007 13:56:39.120915 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nljwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gmhlf_openshift-marketplace(dd8359d4-1d07-4d74-955a-25c5471f3817): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:39 crc kubenswrapper[4717]: E1007 13:56:39.122077 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gmhlf" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" Oct 07 13:56:40 crc kubenswrapper[4717]: I1007 13:56:40.635677 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-nvx87" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.529892 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gmhlf" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.636284 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.636623 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llz6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-p7c4n_openshift-marketplace(b9a5adf2-d75a-4f7f-94a6-55cdbc781de2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.638168 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-p7c4n" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.644257 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.644462 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hfmfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tlzcg_openshift-marketplace(d08a38f6-5d00-440e-9023-5a586d206de3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.645651 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tlzcg" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.658164 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.658485 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjjhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bsfw6_openshift-marketplace(0c0109c8-5421-4dc7-8e10-1843689bc9f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:41 crc kubenswrapper[4717]: E1007 13:56:41.659730 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bsfw6" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" Oct 07 13:56:41 crc kubenswrapper[4717]: I1007 13:56:41.932761 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 07 13:56:42 crc kubenswrapper[4717]: W1007 13:56:42.202855 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod79a298ef_4a4c_43e7_973b_333fc7f0bfe4.slice/crio-f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d WatchSource:0}: Error finding container f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d: Status 404 returned error can't find the container with id f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.273970 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.274410 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8nt7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-xq2zf_openshift-marketplace(dd94dd61-61e7-473c-baf4-60213f5cd072): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.275705 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-xq2zf" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.317149 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.317358 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bzb9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rrhrm_openshift-marketplace(0e9208ec-98a7-48f1-a52e-b7b188e35aa5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.318846 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rrhrm" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.350302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vl8rk"] Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.520207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerStarted","Data":"73d72efe6109bf3e72f1232fd404031adb4623bf464da296aef57dc394de0d48"} Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.524361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerStarted","Data":"cbdccfe89d1cb45f3ffa7aa71faf32be4a5d36b1332faa7ba8a91af275328caf"} Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.534204 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" event={"ID":"004bf989-60a1-4a45-bb4d-fc6a41829f3d","Type":"ContainerStarted","Data":"4beb1cef064850fa00427e9f940ff7bd168a056aa24dff9177dafc05cb0727a0"} Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.535936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"79a298ef-4a4c-43e7-973b-333fc7f0bfe4","Type":"ContainerStarted","Data":"e687733a8405779bef311ddec699ee56aeb8bf2d93fe0c6476f3267e6abb5b61"} Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.536040 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"79a298ef-4a4c-43e7-973b-333fc7f0bfe4","Type":"ContainerStarted","Data":"f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d"} Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.538298 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-xq2zf" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.538565 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tlzcg" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.539696 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-p7c4n" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.547783 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bsfw6" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" Oct 07 13:56:42 crc kubenswrapper[4717]: E1007 13:56:42.547955 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rrhrm" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" Oct 07 13:56:42 crc kubenswrapper[4717]: I1007 13:56:42.611056 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=29.611038054 podStartE2EDuration="29.611038054s" podCreationTimestamp="2025-10-07 13:56:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:42.609970105 +0000 UTC m=+184.437895917" watchObservedRunningTime="2025-10-07 13:56:42.611038054 +0000 UTC m=+184.438963846" Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.541105 4717 generic.go:334] "Generic (PLEG): container finished" podID="144e34a6-69de-495d-afc2-96b1580f599f" containerID="73d72efe6109bf3e72f1232fd404031adb4623bf464da296aef57dc394de0d48" exitCode=0 Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.542335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerDied","Data":"73d72efe6109bf3e72f1232fd404031adb4623bf464da296aef57dc394de0d48"} Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.556235 4717 generic.go:334] "Generic (PLEG): container finished" podID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" 
containerID="cbdccfe89d1cb45f3ffa7aa71faf32be4a5d36b1332faa7ba8a91af275328caf" exitCode=0 Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.556629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerDied","Data":"cbdccfe89d1cb45f3ffa7aa71faf32be4a5d36b1332faa7ba8a91af275328caf"} Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.565375 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" event={"ID":"004bf989-60a1-4a45-bb4d-fc6a41829f3d","Type":"ContainerStarted","Data":"b728e7037f1aa3c466cab23d2ab4c3d1fb5b690eb2199ec9016783dd5641c99c"} Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.565691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vl8rk" event={"ID":"004bf989-60a1-4a45-bb4d-fc6a41829f3d","Type":"ContainerStarted","Data":"12a0f1a005e7b2a1c722a838a7cee33469f5c7505096e6054ce6dcb4ef12dc28"} Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.567090 4717 generic.go:334] "Generic (PLEG): container finished" podID="79a298ef-4a4c-43e7-973b-333fc7f0bfe4" containerID="e687733a8405779bef311ddec699ee56aeb8bf2d93fe0c6476f3267e6abb5b61" exitCode=0 Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.567149 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"79a298ef-4a4c-43e7-973b-333fc7f0bfe4","Type":"ContainerDied","Data":"e687733a8405779bef311ddec699ee56aeb8bf2d93fe0c6476f3267e6abb5b61"} Oct 07 13:56:43 crc kubenswrapper[4717]: I1007 13:56:43.599313 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vl8rk" podStartSLOduration=164.599295369 podStartE2EDuration="2m44.599295369s" podCreationTimestamp="2025-10-07 13:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:56:43.594897771 +0000 UTC m=+185.422823563" watchObservedRunningTime="2025-10-07 13:56:43.599295369 +0000 UTC m=+185.427221161" Oct 07 13:56:44 crc kubenswrapper[4717]: I1007 13:56:44.810528 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.010928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir\") pod \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.011029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access\") pod \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\" (UID: \"79a298ef-4a4c-43e7-973b-333fc7f0bfe4\") " Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.011608 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "79a298ef-4a4c-43e7-973b-333fc7f0bfe4" (UID: "79a298ef-4a4c-43e7-973b-333fc7f0bfe4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.017116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "79a298ef-4a4c-43e7-973b-333fc7f0bfe4" (UID: "79a298ef-4a4c-43e7-973b-333fc7f0bfe4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.113070 4717 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.113106 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/79a298ef-4a4c-43e7-973b-333fc7f0bfe4-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.581635 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerStarted","Data":"7b1182f8250f7eb1afd541024b7307f190c672a05751a077da236d63ba04057a"} Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.583786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"79a298ef-4a4c-43e7-973b-333fc7f0bfe4","Type":"ContainerDied","Data":"f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d"} Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.583815 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8f42001a7056fda430acebae07f411eccd1f7ca26fe85eec9e83b9fa4bf6d3d" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.583877 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.590279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerStarted","Data":"2e6f3f9c1ad43bdf4c1618259e063e9b3c140096c135ea9c3bf730c4e9b59a91"} Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.607741 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xxhfh" podStartSLOduration=3.033191294 podStartE2EDuration="39.607722003s" podCreationTimestamp="2025-10-07 13:56:06 +0000 UTC" firstStartedPulling="2025-10-07 13:56:08.062236484 +0000 UTC m=+149.890162266" lastFinishedPulling="2025-10-07 13:56:44.636767183 +0000 UTC m=+186.464692975" observedRunningTime="2025-10-07 13:56:45.602746039 +0000 UTC m=+187.430671831" watchObservedRunningTime="2025-10-07 13:56:45.607722003 +0000 UTC m=+187.435647795" Oct 07 13:56:45 crc kubenswrapper[4717]: I1007 13:56:45.622673 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8gqgw" podStartSLOduration=3.232421131 podStartE2EDuration="36.622651423s" podCreationTimestamp="2025-10-07 13:56:09 +0000 UTC" firstStartedPulling="2025-10-07 13:56:11.228261471 +0000 UTC m=+153.056187253" lastFinishedPulling="2025-10-07 13:56:44.618491753 +0000 UTC m=+186.446417545" observedRunningTime="2025-10-07 13:56:45.619597171 +0000 UTC m=+187.447522973" watchObservedRunningTime="2025-10-07 13:56:45.622651423 +0000 UTC m=+187.450577215" Oct 07 13:56:46 crc kubenswrapper[4717]: I1007 13:56:46.578499 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:46 crc kubenswrapper[4717]: I1007 13:56:46.580814 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:46 crc kubenswrapper[4717]: I1007 13:56:46.724263 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:47 crc kubenswrapper[4717]: I1007 13:56:47.129801 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 07 13:56:50 crc kubenswrapper[4717]: I1007 13:56:50.018334 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:50 crc kubenswrapper[4717]: I1007 13:56:50.018671 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:50 crc kubenswrapper[4717]: I1007 13:56:50.053842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:50 crc kubenswrapper[4717]: I1007 13:56:50.654034 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:51 crc kubenswrapper[4717]: I1007 13:56:51.754438 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:52 crc kubenswrapper[4717]: I1007 13:56:52.622017 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8gqgw" 
podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="registry-server" containerID="cri-o://2e6f3f9c1ad43bdf4c1618259e063e9b3c140096c135ea9c3bf730c4e9b59a91" gracePeriod=2 Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.635676 4717 generic.go:334] "Generic (PLEG): container finished" podID="144e34a6-69de-495d-afc2-96b1580f599f" containerID="2e6f3f9c1ad43bdf4c1618259e063e9b3c140096c135ea9c3bf730c4e9b59a91" exitCode=0 Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.635819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerDied","Data":"2e6f3f9c1ad43bdf4c1618259e063e9b3c140096c135ea9c3bf730c4e9b59a91"} Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.669025 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.832889 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6754k\" (UniqueName: \"kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k\") pod \"144e34a6-69de-495d-afc2-96b1580f599f\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.833260 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content\") pod \"144e34a6-69de-495d-afc2-96b1580f599f\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.833332 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities\") pod \"144e34a6-69de-495d-afc2-96b1580f599f\" (UID: \"144e34a6-69de-495d-afc2-96b1580f599f\") " Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.834182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities" (OuterVolumeSpecName: "utilities") pod "144e34a6-69de-495d-afc2-96b1580f599f" (UID: "144e34a6-69de-495d-afc2-96b1580f599f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.839224 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k" (OuterVolumeSpecName: "kube-api-access-6754k") pod "144e34a6-69de-495d-afc2-96b1580f599f" (UID: "144e34a6-69de-495d-afc2-96b1580f599f"). InnerVolumeSpecName "kube-api-access-6754k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.934594 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:54 crc kubenswrapper[4717]: I1007 13:56:54.934637 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6754k\" (UniqueName: \"kubernetes.io/projected/144e34a6-69de-495d-afc2-96b1580f599f-kube-api-access-6754k\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.206288 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "144e34a6-69de-495d-afc2-96b1580f599f" (UID: "144e34a6-69de-495d-afc2-96b1580f599f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.239167 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/144e34a6-69de-495d-afc2-96b1580f599f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.641953 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gqgw" event={"ID":"144e34a6-69de-495d-afc2-96b1580f599f","Type":"ContainerDied","Data":"3f4059f035bf2ec83183150667443611dd76ab50cf9d3bc8a81163d4480d0e51"} Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.642025 4717 scope.go:117] "RemoveContainer" containerID="2e6f3f9c1ad43bdf4c1618259e063e9b3c140096c135ea9c3bf730c4e9b59a91" Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.642047 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8gqgw" Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.676068 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.679493 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8gqgw"] Oct 07 13:56:55 crc kubenswrapper[4717]: I1007 13:56:55.909385 4717 scope.go:117] "RemoveContainer" containerID="73d72efe6109bf3e72f1232fd404031adb4623bf464da296aef57dc394de0d48" Oct 07 13:56:56 crc kubenswrapper[4717]: I1007 13:56:56.211148 4717 scope.go:117] "RemoveContainer" containerID="8ad789fbb4f38aa935b756ff6624d1cd7395c70d31251bbc495d77664b47def7" Oct 07 13:56:56 crc kubenswrapper[4717]: I1007 13:56:56.621307 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:56:56 crc kubenswrapper[4717]: I1007 13:56:56.875361 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144e34a6-69de-495d-afc2-96b1580f599f" path="/var/lib/kubelet/pods/144e34a6-69de-495d-afc2-96b1580f599f/volumes" Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.659924 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerID="795b723e3a32610c9f83dcac0cef100a0fec2c1a747ce000e82f4aed6b381e71" exitCode=0 Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.660311 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerDied","Data":"795b723e3a32610c9f83dcac0cef100a0fec2c1a747ce000e82f4aed6b381e71"} Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.674023 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerID="7e0efdc8fb4d359dc29b59dfe475877e7b747facb24eea6b5b376256b2ef2d78" exitCode=0 Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.674433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerDied","Data":"7e0efdc8fb4d359dc29b59dfe475877e7b747facb24eea6b5b376256b2ef2d78"} Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.685440 4717 generic.go:334] "Generic (PLEG): container finished" podID="d08a38f6-5d00-440e-9023-5a586d206de3" containerID="0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38" exitCode=0 Oct 07 13:56:58 crc kubenswrapper[4717]: I1007 13:56:58.685495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerDied","Data":"0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38"} Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.609634 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.611057 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.611189 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.611862 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.612075 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89" gracePeriod=600 Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.711024 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerStarted","Data":"eaee567c7c2335c944c59a82d69e8796abd82984d21afbaf2dffb97f5708ef1d"} Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.712574 4717 generic.go:334] "Generic (PLEG): container finished" podID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerID="de1d6211843cc2177133c5563a3cdeb5c73b0729a5b0e961756b2c2595b0f18e" exitCode=0 Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.712640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerDied","Data":"de1d6211843cc2177133c5563a3cdeb5c73b0729a5b0e961756b2c2595b0f18e"} Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.714178 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerID="a4271cf1ca636aaa16965ea02521754246ec09524de45c05d72a43fceb339c2a" exitCode=0 Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.714237 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerDied","Data":"a4271cf1ca636aaa16965ea02521754246ec09524de45c05d72a43fceb339c2a"} Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.726553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerStarted","Data":"417e1d8d83e7698ce6c9a166da624cb24d640830d81430dd81a744e60bb0b5c8"} Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.755326 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xq2zf" podStartSLOduration=2.929769973 podStartE2EDuration="53.755309946s" podCreationTimestamp="2025-10-07 13:56:08 +0000 UTC" firstStartedPulling="2025-10-07 13:56:10.205205824 +0000 UTC m=+152.033131616" lastFinishedPulling="2025-10-07 13:57:01.030745787 +0000 UTC m=+202.858671589" observedRunningTime="2025-10-07 13:57:01.734326484 +0000 UTC m=+203.562252276" 
watchObservedRunningTime="2025-10-07 13:57:01.755309946 +0000 UTC m=+203.583235738" Oct 07 13:57:01 crc kubenswrapper[4717]: I1007 13:57:01.785117 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gmhlf" podStartSLOduration=2.631704448 podStartE2EDuration="55.785093737s" podCreationTimestamp="2025-10-07 13:56:06 +0000 UTC" firstStartedPulling="2025-10-07 13:56:08.062338997 +0000 UTC m=+149.890264789" lastFinishedPulling="2025-10-07 13:57:01.215728296 +0000 UTC m=+203.043654078" observedRunningTime="2025-10-07 13:57:01.784255154 +0000 UTC m=+203.612180946" watchObservedRunningTime="2025-10-07 13:57:01.785093737 +0000 UTC m=+203.613019529" Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.732151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerStarted","Data":"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40"} Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.740132 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89" exitCode=0 Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.740233 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89"} Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.740291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a"} Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.741710 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerID="12ac2575412462e9af21db3539442bef870e8a6a573c53950fa1c4a52f2fb7b0" exitCode=0 Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.741736 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerDied","Data":"12ac2575412462e9af21db3539442bef870e8a6a573c53950fa1c4a52f2fb7b0"} Oct 07 13:57:02 crc kubenswrapper[4717]: I1007 13:57:02.752933 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tlzcg" podStartSLOduration=4.25248067 podStartE2EDuration="56.752919933s" podCreationTimestamp="2025-10-07 13:56:06 +0000 UTC" firstStartedPulling="2025-10-07 13:56:09.080734919 +0000 UTC m=+150.908660711" lastFinishedPulling="2025-10-07 13:57:01.581174182 +0000 UTC m=+203.409099974" observedRunningTime="2025-10-07 13:57:02.751973577 +0000 UTC m=+204.579899389" watchObservedRunningTime="2025-10-07 13:57:02.752919933 +0000 UTC m=+204.580845715" Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 13:57:03.748862 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerStarted","Data":"f4df93a45316e3ceb5d6dc9e912f587989b17d2004b354027d750a26cc89e0a6"} Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 
13:57:03.752717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerStarted","Data":"c0cd71d9fd5cb39f6eed112ebfe92552c5c2c21beee692173ea5ed8647eeb3f4"} Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 13:57:03.755407 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerStarted","Data":"d2b91194a6cffb344f8fffd8f6e0d45a70e9fa39ed6f1e97ffeecb322cb0d147"} Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 13:57:03.766756 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bsfw6" podStartSLOduration=2.926562687 podStartE2EDuration="57.766735521s" podCreationTimestamp="2025-10-07 13:56:06 +0000 UTC" firstStartedPulling="2025-10-07 13:56:08.022613652 +0000 UTC m=+149.850539444" lastFinishedPulling="2025-10-07 13:57:02.862786486 +0000 UTC m=+204.690712278" observedRunningTime="2025-10-07 13:57:03.765643561 +0000 UTC m=+205.593569353" watchObservedRunningTime="2025-10-07 13:57:03.766735521 +0000 UTC m=+205.594661313" Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 13:57:03.807347 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rrhrm" podStartSLOduration=1.7579902779999999 podStartE2EDuration="55.807328157s" podCreationTimestamp="2025-10-07 13:56:08 +0000 UTC" firstStartedPulling="2025-10-07 13:56:09.084382367 +0000 UTC m=+150.912308149" lastFinishedPulling="2025-10-07 13:57:03.133720236 +0000 UTC m=+204.961646028" observedRunningTime="2025-10-07 13:57:03.804545041 +0000 UTC m=+205.632470863" watchObservedRunningTime="2025-10-07 13:57:03.807328157 +0000 UTC m=+205.635253949" Oct 07 13:57:03 crc kubenswrapper[4717]: I1007 13:57:03.823662 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7c4n" podStartSLOduration=2.80850635 podStartE2EDuration="54.823644231s" podCreationTimestamp="2025-10-07 13:56:09 +0000 UTC" firstStartedPulling="2025-10-07 13:56:11.222562058 +0000 UTC m=+153.050487850" lastFinishedPulling="2025-10-07 13:57:03.237699939 +0000 UTC m=+205.065625731" observedRunningTime="2025-10-07 13:57:03.820322681 +0000 UTC m=+205.648248493" watchObservedRunningTime="2025-10-07 13:57:03.823644231 +0000 UTC m=+205.651570013" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.412509 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.414767 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.460914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.764773 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.783577 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.783631 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.848517 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.888186 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.952764 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.953132 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:06 crc kubenswrapper[4717]: I1007 13:57:06.998539 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:07 crc kubenswrapper[4717]: I1007 13:57:07.813303 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.563402 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.563472 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.599264 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.821207 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.946918 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.946969 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:08 crc kubenswrapper[4717]: I1007 13:57:08.983262 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:09 crc kubenswrapper[4717]: I1007 13:57:09.607386 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:09 crc kubenswrapper[4717]: I1007 13:57:09.607440 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:09 crc kubenswrapper[4717]: I1007 13:57:09.645880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:09 crc kubenswrapper[4717]: I1007 13:57:09.822204 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:09 crc kubenswrapper[4717]: I1007 13:57:09.828797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.355679 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.355966 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tlzcg" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="registry-server" containerID="cri-o://d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40" gracePeriod=2 Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.682921 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.796836 4717 generic.go:334] "Generic (PLEG): container finished" podID="d08a38f6-5d00-440e-9023-5a586d206de3" containerID="d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40" exitCode=0 Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.796878 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerDied","Data":"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40"} Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.796905 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tlzcg" event={"ID":"d08a38f6-5d00-440e-9023-5a586d206de3","Type":"ContainerDied","Data":"9b877c1ec0ef1de97f2f0ef61e82e427abd798c8ae24d3f54b73e85e3e81c1c3"} Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.796922 4717 scope.go:117] "RemoveContainer" containerID="d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.796943 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tlzcg" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.817480 4717 scope.go:117] "RemoveContainer" containerID="0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.839878 4717 scope.go:117] "RemoveContainer" containerID="ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.840214 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities\") pod \"d08a38f6-5d00-440e-9023-5a586d206de3\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.840269 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content\") pod \"d08a38f6-5d00-440e-9023-5a586d206de3\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.840307 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfmfg\" (UniqueName: \"kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg\") pod \"d08a38f6-5d00-440e-9023-5a586d206de3\" (UID: \"d08a38f6-5d00-440e-9023-5a586d206de3\") " Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.841467 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities" (OuterVolumeSpecName: "utilities") pod "d08a38f6-5d00-440e-9023-5a586d206de3" (UID: "d08a38f6-5d00-440e-9023-5a586d206de3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.847129 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg" (OuterVolumeSpecName: "kube-api-access-hfmfg") pod "d08a38f6-5d00-440e-9023-5a586d206de3" (UID: "d08a38f6-5d00-440e-9023-5a586d206de3"). InnerVolumeSpecName "kube-api-access-hfmfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.871302 4717 scope.go:117] "RemoveContainer" containerID="d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40" Oct 07 13:57:11 crc kubenswrapper[4717]: E1007 13:57:11.871742 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40\": container with ID starting with d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40 not found: ID does not exist" containerID="d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.871791 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40"} err="failed to get container status \"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40\": rpc error: code = NotFound desc = could not find container \"d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40\": container with ID starting with d8104fe06732ad321d0049b90794ec63ce0a56032d409696f735459c7b348d40 not found: ID does not exist" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.871826 4717 scope.go:117] "RemoveContainer" containerID="0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38" Oct 07 13:57:11 crc kubenswrapper[4717]: E1007 13:57:11.872403 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38\": container with ID starting with 0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38 not found: ID does not exist" containerID="0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.872438 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38"} err="failed to get container status \"0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38\": rpc error: code = NotFound desc = could not find container \"0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38\": container with ID starting with 0ecf81ff4bbacf70bae7a936ce50239fc8ef105b8fb953fa94de1ada0a9d2d38 not found: ID does not exist" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.872462 4717 scope.go:117] "RemoveContainer" containerID="ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec" Oct 07 13:57:11 crc kubenswrapper[4717]: E1007 13:57:11.872771 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec\": container with ID starting with ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec not found: ID does not exist" containerID="ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.872796 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec"} err="failed to get container status \"ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec\": rpc error: code = NotFound desc = could not 
find container \"ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec\": container with ID starting with ed293999c2955f4cd5d007950b0cc55bb38f780623a8018b4c1ab47745f180ec not found: ID does not exist" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.902599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d08a38f6-5d00-440e-9023-5a586d206de3" (UID: "d08a38f6-5d00-440e-9023-5a586d206de3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.941602 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.941636 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d08a38f6-5d00-440e-9023-5a586d206de3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:11 crc kubenswrapper[4717]: I1007 13:57:11.941647 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfmfg\" (UniqueName: \"kubernetes.io/projected/d08a38f6-5d00-440e-9023-5a586d206de3-kube-api-access-hfmfg\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:12 crc kubenswrapper[4717]: I1007 13:57:12.120704 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:57:12 crc kubenswrapper[4717]: I1007 13:57:12.125060 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tlzcg"] Oct 07 13:57:12 crc kubenswrapper[4717]: I1007 13:57:12.875900 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" path="/var/lib/kubelet/pods/d08a38f6-5d00-440e-9023-5a586d206de3/volumes" Oct 07 13:57:13 crc kubenswrapper[4717]: I1007 13:57:13.155593 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:57:13 crc kubenswrapper[4717]: I1007 13:57:13.155831 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xq2zf" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="registry-server" containerID="cri-o://eaee567c7c2335c944c59a82d69e8796abd82984d21afbaf2dffb97f5708ef1d" gracePeriod=2 Oct 07 13:57:13 crc kubenswrapper[4717]: I1007 13:57:13.810220 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerID="eaee567c7c2335c944c59a82d69e8796abd82984d21afbaf2dffb97f5708ef1d" exitCode=0 Oct 07 13:57:13 crc kubenswrapper[4717]: I1007 13:57:13.810272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerDied","Data":"eaee567c7c2335c944c59a82d69e8796abd82984d21afbaf2dffb97f5708ef1d"} Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.237598 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.375952 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content\") pod \"dd94dd61-61e7-473c-baf4-60213f5cd072\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.376017 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nt7s\" (UniqueName: \"kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s\") pod \"dd94dd61-61e7-473c-baf4-60213f5cd072\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.376121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities\") pod \"dd94dd61-61e7-473c-baf4-60213f5cd072\" (UID: \"dd94dd61-61e7-473c-baf4-60213f5cd072\") " Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.376903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities" (OuterVolumeSpecName: "utilities") pod "dd94dd61-61e7-473c-baf4-60213f5cd072" (UID: "dd94dd61-61e7-473c-baf4-60213f5cd072"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.380869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s" (OuterVolumeSpecName: "kube-api-access-8nt7s") pod "dd94dd61-61e7-473c-baf4-60213f5cd072" (UID: "dd94dd61-61e7-473c-baf4-60213f5cd072"). InnerVolumeSpecName "kube-api-access-8nt7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.396879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd94dd61-61e7-473c-baf4-60213f5cd072" (UID: "dd94dd61-61e7-473c-baf4-60213f5cd072"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.477498 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.477579 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nt7s\" (UniqueName: \"kubernetes.io/projected/dd94dd61-61e7-473c-baf4-60213f5cd072-kube-api-access-8nt7s\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.477594 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd94dd61-61e7-473c-baf4-60213f5cd072-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.816894 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xq2zf" event={"ID":"dd94dd61-61e7-473c-baf4-60213f5cd072","Type":"ContainerDied","Data":"5be3b509387a30095ba8e37c3247d3267bc9499c75dad07acad5a9a1e062a8e9"} Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.816969 4717 scope.go:117] "RemoveContainer" containerID="eaee567c7c2335c944c59a82d69e8796abd82984d21afbaf2dffb97f5708ef1d" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.816983 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xq2zf" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.831999 4717 scope.go:117] "RemoveContainer" containerID="7e0efdc8fb4d359dc29b59dfe475877e7b747facb24eea6b5b376256b2ef2d78" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.856908 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.859462 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xq2zf"] Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.863458 4717 scope.go:117] "RemoveContainer" containerID="245da47b479c2306a320ef7e82841b8f9afe690e6606254998daee54adc7d0bb" Oct 07 13:57:14 crc kubenswrapper[4717]: I1007 13:57:14.878405 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" path="/var/lib/kubelet/pods/dd94dd61-61e7-473c-baf4-60213f5cd072/volumes" Oct 07 13:57:16 crc kubenswrapper[4717]: I1007 13:57:16.817579 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.551525 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.551772 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bsfw6" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="registry-server" containerID="cri-o://f4df93a45316e3ceb5d6dc9e912f587989b17d2004b354027d750a26cc89e0a6" gracePeriod=2 Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.835496 4717 generic.go:334] "Generic (PLEG): container finished" podID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerID="f4df93a45316e3ceb5d6dc9e912f587989b17d2004b354027d750a26cc89e0a6" exitCode=0 Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 
13:57:17.835590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerDied","Data":"f4df93a45316e3ceb5d6dc9e912f587989b17d2004b354027d750a26cc89e0a6"} Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.835786 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bsfw6" event={"ID":"0c0109c8-5421-4dc7-8e10-1843689bc9f9","Type":"ContainerDied","Data":"ee8c3349c8956d5c2962c225aec9e14446a8c925096924a92c4f4a4dbd620b20"} Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.835802 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee8c3349c8956d5c2962c225aec9e14446a8c925096924a92c4f4a4dbd620b20" Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.859488 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.923467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjjhq\" (UniqueName: \"kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq\") pod \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.923601 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities\") pod \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.923639 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content\") pod \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\" (UID: \"0c0109c8-5421-4dc7-8e10-1843689bc9f9\") " Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.924644 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities" (OuterVolumeSpecName: "utilities") pod "0c0109c8-5421-4dc7-8e10-1843689bc9f9" (UID: "0c0109c8-5421-4dc7-8e10-1843689bc9f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.928952 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq" (OuterVolumeSpecName: "kube-api-access-sjjhq") pod "0c0109c8-5421-4dc7-8e10-1843689bc9f9" (UID: "0c0109c8-5421-4dc7-8e10-1843689bc9f9"). InnerVolumeSpecName "kube-api-access-sjjhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:17 crc kubenswrapper[4717]: I1007 13:57:17.974855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c0109c8-5421-4dc7-8e10-1843689bc9f9" (UID: "0c0109c8-5421-4dc7-8e10-1843689bc9f9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.024469 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjjhq\" (UniqueName: \"kubernetes.io/projected/0c0109c8-5421-4dc7-8e10-1843689bc9f9-kube-api-access-sjjhq\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.024695 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.024797 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c0109c8-5421-4dc7-8e10-1843689bc9f9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.845646 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bsfw6" Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.887095 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:57:18 crc kubenswrapper[4717]: I1007 13:57:18.889165 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bsfw6"] Oct 07 13:57:20 crc kubenswrapper[4717]: I1007 13:57:20.875366 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" path="/var/lib/kubelet/pods/0c0109c8-5421-4dc7-8e10-1843689bc9f9/volumes" Oct 07 13:57:31 crc kubenswrapper[4717]: I1007 13:57:31.796890 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" podUID="88668bb1-9d3d-4761-a7a2-07a57d489243" containerName="oauth-openshift" containerID="cri-o://408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d" gracePeriod=15 Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.124033 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.153826 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-54lrr"] Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154113 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154126 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154144 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154151 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154158 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154165 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154189 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154200 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154205 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154213 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154218 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154225 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154230 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154238 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154244 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="extract-content" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154270 4717 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="88668bb1-9d3d-4761-a7a2-07a57d489243" containerName="oauth-openshift" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154276 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="88668bb1-9d3d-4761-a7a2-07a57d489243" containerName="oauth-openshift" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154286 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154291 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154300 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79a298ef-4a4c-43e7-973b-333fc7f0bfe4" containerName="pruner" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154306 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="79a298ef-4a4c-43e7-973b-333fc7f0bfe4" containerName="pruner" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154313 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154319 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154344 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154351 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.154360 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154365 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="extract-utilities" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154470 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd94dd61-61e7-473c-baf4-60213f5cd072" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154501 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="144e34a6-69de-495d-afc2-96b1580f599f" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154511 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="79a298ef-4a4c-43e7-973b-333fc7f0bfe4" containerName="pruner" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154518 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08a38f6-5d00-440e-9023-5a586d206de3" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154527 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0109c8-5421-4dc7-8e10-1843689bc9f9" containerName="registry-server" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.154536 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="88668bb1-9d3d-4761-a7a2-07a57d489243" containerName="oauth-openshift" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 
13:57:32.155077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.165684 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-54lrr"] Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292652 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292722 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292780 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292796 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292814 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292857 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292939 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292955 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.292971 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rthxr\" (UniqueName: \"kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293036 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293056 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data\") pod \"88668bb1-9d3d-4761-a7a2-07a57d489243\" (UID: \"88668bb1-9d3d-4761-a7a2-07a57d489243\") " Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293214 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293238 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-dir\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293288 
4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293302 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293336 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293354 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4whpp\" (UniqueName: \"kubernetes.io/projected/57ce476a-9e50-47c7-81d1-a91bb922699d-kube-api-access-4whpp\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-policies\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " 
pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293434 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293473 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293493 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.293694 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.294760 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.294814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.295179 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.299258 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.299591 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr" (OuterVolumeSpecName: "kube-api-access-rthxr") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "kube-api-access-rthxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.299696 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.300182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.300668 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.300903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.301253 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.301422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.301606 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "88668bb1-9d3d-4761-a7a2-07a57d489243" (UID: "88668bb1-9d3d-4761-a7a2-07a57d489243"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.394986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395063 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395105 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395133 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395163 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-dir\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395196 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4whpp\" (UniqueName: \"kubernetes.io/projected/57ce476a-9e50-47c7-81d1-a91bb922699d-kube-api-access-4whpp\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395304 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395337 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395363 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-policies\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395382 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395435 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395452 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395466 4717 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395477 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rthxr\" (UniqueName: \"kubernetes.io/projected/88668bb1-9d3d-4761-a7a2-07a57d489243-kube-api-access-rthxr\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395492 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395505 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395516 4717 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395528 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395542 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395555 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395567 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395579 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395591 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.395603 4717 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88668bb1-9d3d-4761-a7a2-07a57d489243-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.397921 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-dir\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.398798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.399243 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.399381 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.399895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc 
kubenswrapper[4717]: I1007 13:57:32.400134 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.400367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-error\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.400622 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.401076 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.401198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-session\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.401447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/57ce476a-9e50-47c7-81d1-a91bb922699d-audit-policies\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.401425 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-user-template-login\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.402721 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/57ce476a-9e50-47c7-81d1-a91bb922699d-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.412684 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4whpp\" (UniqueName: \"kubernetes.io/projected/57ce476a-9e50-47c7-81d1-a91bb922699d-kube-api-access-4whpp\") pod \"oauth-openshift-6b9699fff8-54lrr\" (UID: \"57ce476a-9e50-47c7-81d1-a91bb922699d\") " pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.479485 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.641557 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b9699fff8-54lrr"] Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.906083 4717 generic.go:334] "Generic (PLEG): container finished" podID="88668bb1-9d3d-4761-a7a2-07a57d489243" containerID="408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d" exitCode=0 Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.906140 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.906153 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" event={"ID":"88668bb1-9d3d-4761-a7a2-07a57d489243","Type":"ContainerDied","Data":"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d"} Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.906179 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-4l69q" event={"ID":"88668bb1-9d3d-4761-a7a2-07a57d489243","Type":"ContainerDied","Data":"4d4ab5bfb160b26e23c3223adc6370e4b9f9caea639e676f85a9dfd28ce0b56f"} Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.906195 4717 scope.go:117] "RemoveContainer" containerID="408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.909580 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" event={"ID":"57ce476a-9e50-47c7-81d1-a91bb922699d","Type":"ContainerStarted","Data":"cb9e3daa3746fa85576c7ed1e4d80990755bdfd0a9c454689b1c6bf290c6c64d"} Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.910268 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" event={"ID":"57ce476a-9e50-47c7-81d1-a91bb922699d","Type":"ContainerStarted","Data":"a3577f49c0574f30931106bc618ec45ba39e9e7c13224dbe6c797261f6bad421"} Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.910383 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.930029 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" podStartSLOduration=26.930001185 podStartE2EDuration="26.930001185s" podCreationTimestamp="2025-10-07 13:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:57:32.927472836 +0000 UTC m=+234.755398628" watchObservedRunningTime="2025-10-07 13:57:32.930001185 +0000 UTC m=+234.757926977" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.945229 4717 scope.go:117] "RemoveContainer" 
containerID="408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d" Oct 07 13:57:32 crc kubenswrapper[4717]: E1007 13:57:32.945717 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d\": container with ID starting with 408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d not found: ID does not exist" containerID="408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.945745 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d"} err="failed to get container status \"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d\": rpc error: code = NotFound desc = could not find container \"408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d\": container with ID starting with 408bc579bbe9b736b8be370d1355245cbaaf5ed7d960e0a0d43b4e0283d16b8d not found: ID does not exist" Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.956688 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:57:32 crc kubenswrapper[4717]: I1007 13:57:32.961240 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-4l69q"] Oct 07 13:57:33 crc kubenswrapper[4717]: I1007 13:57:33.256383 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b9699fff8-54lrr" Oct 07 13:57:34 crc kubenswrapper[4717]: I1007 13:57:34.875315 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88668bb1-9d3d-4761-a7a2-07a57d489243" path="/var/lib/kubelet/pods/88668bb1-9d3d-4761-a7a2-07a57d489243/volumes" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.766569 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.767355 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xxhfh" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="registry-server" containerID="cri-o://7b1182f8250f7eb1afd541024b7307f190c672a05751a077da236d63ba04057a" gracePeriod=30 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.775092 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.775552 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gmhlf" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="registry-server" containerID="cri-o://417e1d8d83e7698ce6c9a166da624cb24d640830d81430dd81a744e60bb0b5c8" gracePeriod=30 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.785315 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.785544 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" 
containerID="cri-o://af55360ba3e38799ba15c8bc160b0db06419043ac145b041c6c442391f09b983" gracePeriod=30 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.795231 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.795738 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rrhrm" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="registry-server" containerID="cri-o://d2b91194a6cffb344f8fffd8f6e0d45a70e9fa39ed6f1e97ffeecb322cb0d147" gracePeriod=30 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.807071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nztkh"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.808219 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.818181 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.818548 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7c4n" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="registry-server" containerID="cri-o://c0cd71d9fd5cb39f6eed112ebfe92552c5c2c21beee692173ea5ed8647eeb3f4" gracePeriod=30 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.832182 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nztkh"] Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.864468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.864739 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.864918 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncdp\" (UniqueName: \"kubernetes.io/projected/56aec528-e156-45c5-ac1a-d55cc129894c-kube-api-access-6ncdp\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.965747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 
07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.966109 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.966177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncdp\" (UniqueName: \"kubernetes.io/projected/56aec528-e156-45c5-ac1a-d55cc129894c-kube-api-access-6ncdp\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.967424 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.977827 4717 generic.go:334] "Generic (PLEG): container finished" podID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerID="d2b91194a6cffb344f8fffd8f6e0d45a70e9fa39ed6f1e97ffeecb322cb0d147" exitCode=0 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.977910 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerDied","Data":"d2b91194a6cffb344f8fffd8f6e0d45a70e9fa39ed6f1e97ffeecb322cb0d147"} Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.982452 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerID="417e1d8d83e7698ce6c9a166da624cb24d640830d81430dd81a744e60bb0b5c8" exitCode=0 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.982556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerDied","Data":"417e1d8d83e7698ce6c9a166da624cb24d640830d81430dd81a744e60bb0b5c8"} Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.985493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/56aec528-e156-45c5-ac1a-d55cc129894c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.987173 4717 generic.go:334] "Generic (PLEG): container finished" podID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerID="7b1182f8250f7eb1afd541024b7307f190c672a05751a077da236d63ba04057a" exitCode=0 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.987214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerDied","Data":"7b1182f8250f7eb1afd541024b7307f190c672a05751a077da236d63ba04057a"} Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.987932 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6ncdp\" (UniqueName: \"kubernetes.io/projected/56aec528-e156-45c5-ac1a-d55cc129894c-kube-api-access-6ncdp\") pod \"marketplace-operator-79b997595-nztkh\" (UID: \"56aec528-e156-45c5-ac1a-d55cc129894c\") " pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.989250 4717 generic.go:334] "Generic (PLEG): container finished" podID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerID="af55360ba3e38799ba15c8bc160b0db06419043ac145b041c6c442391f09b983" exitCode=0 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.989322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" event={"ID":"1281b203-c580-4ebb-8c75-d6c8bafb3ca2","Type":"ContainerDied","Data":"af55360ba3e38799ba15c8bc160b0db06419043ac145b041c6c442391f09b983"} Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.992129 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerID="c0cd71d9fd5cb39f6eed112ebfe92552c5c2c21beee692173ea5ed8647eeb3f4" exitCode=0 Oct 07 13:57:44 crc kubenswrapper[4717]: I1007 13:57:44.992160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerDied","Data":"c0cd71d9fd5cb39f6eed112ebfe92552c5c2c21beee692173ea5ed8647eeb3f4"} Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.129657 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.275111 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.282961 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.284562 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.288111 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.291172 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content\") pod \"dd8359d4-1d07-4d74-955a-25c5471f3817\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374165 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities\") pod \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content\") pod \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374223 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca\") pod \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374278 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities\") pod \"dd8359d4-1d07-4d74-955a-25c5471f3817\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities\") pod \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374338 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics\") pod \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374363 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzb9z\" (UniqueName: \"kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z\") pod \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content\") pod \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374465 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities\") pod \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\" (UID: \"0e9208ec-98a7-48f1-a52e-b7b188e35aa5\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374493 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2sn6\" (UniqueName: \"kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6\") pod \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\" (UID: \"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374515 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llz6j\" (UniqueName: \"kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j\") pod \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374551 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n44b7\" (UniqueName: \"kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7\") pod \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\" (UID: \"1281b203-c580-4ebb-8c75-d6c8bafb3ca2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374572 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content\") pod \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\" (UID: \"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.374608 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljwq\" (UniqueName: \"kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq\") pod \"dd8359d4-1d07-4d74-955a-25c5471f3817\" (UID: \"dd8359d4-1d07-4d74-955a-25c5471f3817\") " Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.375310 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities" (OuterVolumeSpecName: "utilities") pod "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" (UID: "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.375903 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities" (OuterVolumeSpecName: "utilities") pod "dd8359d4-1d07-4d74-955a-25c5471f3817" (UID: "dd8359d4-1d07-4d74-955a-25c5471f3817"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.376116 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities" (OuterVolumeSpecName: "utilities") pod "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" (UID: "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.377180 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities" (OuterVolumeSpecName: "utilities") pod "0e9208ec-98a7-48f1-a52e-b7b188e35aa5" (UID: "0e9208ec-98a7-48f1-a52e-b7b188e35aa5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.377300 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1281b203-c580-4ebb-8c75-d6c8bafb3ca2" (UID: "1281b203-c580-4ebb-8c75-d6c8bafb3ca2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.379264 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6" (OuterVolumeSpecName: "kube-api-access-r2sn6") pod "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" (UID: "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb"). InnerVolumeSpecName "kube-api-access-r2sn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.379702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7" (OuterVolumeSpecName: "kube-api-access-n44b7") pod "1281b203-c580-4ebb-8c75-d6c8bafb3ca2" (UID: "1281b203-c580-4ebb-8c75-d6c8bafb3ca2"). InnerVolumeSpecName "kube-api-access-n44b7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.380771 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j" (OuterVolumeSpecName: "kube-api-access-llz6j") pod "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" (UID: "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2"). InnerVolumeSpecName "kube-api-access-llz6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.389458 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1281b203-c580-4ebb-8c75-d6c8bafb3ca2" (UID: "1281b203-c580-4ebb-8c75-d6c8bafb3ca2"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.392057 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq" (OuterVolumeSpecName: "kube-api-access-nljwq") pod "dd8359d4-1d07-4d74-955a-25c5471f3817" (UID: "dd8359d4-1d07-4d74-955a-25c5471f3817"). InnerVolumeSpecName "kube-api-access-nljwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.392293 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z" (OuterVolumeSpecName: "kube-api-access-bzb9z") pod "0e9208ec-98a7-48f1-a52e-b7b188e35aa5" (UID: "0e9208ec-98a7-48f1-a52e-b7b188e35aa5"). InnerVolumeSpecName "kube-api-access-bzb9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.398803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e9208ec-98a7-48f1-a52e-b7b188e35aa5" (UID: "0e9208ec-98a7-48f1-a52e-b7b188e35aa5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.435422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" (UID: "c55f6501-b4f7-4bad-814c-83c1b3c8b5cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.450697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd8359d4-1d07-4d74-955a-25c5471f3817" (UID: "dd8359d4-1d07-4d74-955a-25c5471f3817"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.458181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" (UID: "b9a5adf2-d75a-4f7f-94a6-55cdbc781de2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476307 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476339 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476347 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476358 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476369 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd8359d4-1d07-4d74-955a-25c5471f3817-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476377 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476385 4717 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476394 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzb9z\" (UniqueName: \"kubernetes.io/projected/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-kube-api-access-bzb9z\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476402 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476410 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9208ec-98a7-48f1-a52e-b7b188e35aa5-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476418 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llz6j\" (UniqueName: \"kubernetes.io/projected/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-kube-api-access-llz6j\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476426 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2sn6\" (UniqueName: \"kubernetes.io/projected/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb-kube-api-access-r2sn6\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476435 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n44b7\" (UniqueName: 
\"kubernetes.io/projected/1281b203-c580-4ebb-8c75-d6c8bafb3ca2-kube-api-access-n44b7\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476442 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.476452 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljwq\" (UniqueName: \"kubernetes.io/projected/dd8359d4-1d07-4d74-955a-25c5471f3817-kube-api-access-nljwq\") on node \"crc\" DevicePath \"\"" Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.602876 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nztkh"] Oct 07 13:57:45 crc kubenswrapper[4717]: I1007 13:57:45.999536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" event={"ID":"56aec528-e156-45c5-ac1a-d55cc129894c","Type":"ContainerStarted","Data":"bd25892dc8aad5d92b356f2a6ce48ee815ffbb6bab213724a80adc27b2fe6ecb"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.000712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" event={"ID":"56aec528-e156-45c5-ac1a-d55cc129894c","Type":"ContainerStarted","Data":"6e0d8ea7d9037c087446a3cbc5157c59993b81cf81d3b512cf2918ec1808b33a"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.000827 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.001753 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.002713 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xxhfh" event={"ID":"c55f6501-b4f7-4bad-814c-83c1b3c8b5cb","Type":"ContainerDied","Data":"05b0af3b86a5255a59f51f2b28f695f3ab288e7049f1c87260541cb5223426f8"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.002787 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xxhfh" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.003055 4717 scope.go:117] "RemoveContainer" containerID="7b1182f8250f7eb1afd541024b7307f190c672a05751a077da236d63ba04057a" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.004403 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" event={"ID":"1281b203-c580-4ebb-8c75-d6c8bafb3ca2","Type":"ContainerDied","Data":"9ce0fbc260678964cb3a325f3b78fb24a85cbc8f17bb482e90e33adc93115403"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.004493 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t9hbf" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.007133 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7c4n" event={"ID":"b9a5adf2-d75a-4f7f-94a6-55cdbc781de2","Type":"ContainerDied","Data":"c5f7b37a11461bd67f1bd90de330c12738b9b6da8da611a979609f5968acb062"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.007217 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7c4n" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.010807 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rrhrm" event={"ID":"0e9208ec-98a7-48f1-a52e-b7b188e35aa5","Type":"ContainerDied","Data":"7dd6738d4523a135d5a4cf3189be80192ae0d6990e8061d4320b04f405550079"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.010916 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rrhrm" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.022314 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nztkh" podStartSLOduration=2.021876447 podStartE2EDuration="2.021876447s" podCreationTimestamp="2025-10-07 13:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 13:57:46.016336926 +0000 UTC m=+247.844262728" watchObservedRunningTime="2025-10-07 13:57:46.021876447 +0000 UTC m=+247.849802239" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.023136 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gmhlf" event={"ID":"dd8359d4-1d07-4d74-955a-25c5471f3817","Type":"ContainerDied","Data":"2a828b1a1d7b6c08f7fce9a52ec3497a478a36a896dbdad751a207903e0a8048"} Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.023255 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gmhlf" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.023997 4717 scope.go:117] "RemoveContainer" containerID="cbdccfe89d1cb45f3ffa7aa71faf32be4a5d36b1332faa7ba8a91af275328caf" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.076173 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.079499 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xxhfh"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.081791 4717 scope.go:117] "RemoveContainer" containerID="08bee7e9be5834c9b7a875ab58291d9262aa7392c37a0adb8158e2363b32ad90" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.089917 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.092986 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7c4n"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.098987 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.102088 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t9hbf"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.113854 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.115619 4717 scope.go:117] "RemoveContainer" containerID="af55360ba3e38799ba15c8bc160b0db06419043ac145b041c6c442391f09b983" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.119355 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rrhrm"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.124990 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.128074 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gmhlf"] Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.133228 4717 scope.go:117] "RemoveContainer" containerID="c0cd71d9fd5cb39f6eed112ebfe92552c5c2c21beee692173ea5ed8647eeb3f4" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.146202 4717 scope.go:117] "RemoveContainer" containerID="12ac2575412462e9af21db3539442bef870e8a6a573c53950fa1c4a52f2fb7b0" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.159845 4717 scope.go:117] "RemoveContainer" containerID="852d82f9d7c6abafb0cfa3da43f18d8c8f17cfe3662094fe89c0e4979298cad0" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.173403 4717 scope.go:117] "RemoveContainer" containerID="d2b91194a6cffb344f8fffd8f6e0d45a70e9fa39ed6f1e97ffeecb322cb0d147" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.188636 4717 scope.go:117] "RemoveContainer" containerID="a4271cf1ca636aaa16965ea02521754246ec09524de45c05d72a43fceb339c2a" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.204990 4717 scope.go:117] "RemoveContainer" containerID="b31c229f9db5266e57340081e676b0e0dd92d4d8b9a1fa051f95a1eb40121225" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.219091 4717 scope.go:117] "RemoveContainer" 
containerID="417e1d8d83e7698ce6c9a166da624cb24d640830d81430dd81a744e60bb0b5c8" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.232101 4717 scope.go:117] "RemoveContainer" containerID="795b723e3a32610c9f83dcac0cef100a0fec2c1a747ce000e82f4aed6b381e71" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.251548 4717 scope.go:117] "RemoveContainer" containerID="3a9f49de65c6f063b3136e0e5b2314a4eb9e7389275f7d7cd5f388136bb07f67" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.878229 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" path="/var/lib/kubelet/pods/0e9208ec-98a7-48f1-a52e-b7b188e35aa5/volumes" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.879566 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" path="/var/lib/kubelet/pods/1281b203-c580-4ebb-8c75-d6c8bafb3ca2/volumes" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.880141 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" path="/var/lib/kubelet/pods/b9a5adf2-d75a-4f7f-94a6-55cdbc781de2/volumes" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.881271 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" path="/var/lib/kubelet/pods/c55f6501-b4f7-4bad-814c-83c1b3c8b5cb/volumes" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.881897 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" path="/var/lib/kubelet/pods/dd8359d4-1d07-4d74-955a-25c5471f3817/volumes" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980582 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980807 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980821 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980831 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980838 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980846 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980852 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980864 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980870 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980881 4717 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980887 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980894 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980901 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980912 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980919 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980927 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980934 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980944 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980950 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980959 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980964 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980972 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="extract-content" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.980987 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.980994 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="extract-utilities" Oct 07 13:57:46 crc kubenswrapper[4717]: E1007 13:57:46.981024 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981032 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981130 
4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9208ec-98a7-48f1-a52e-b7b188e35aa5" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981145 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a5adf2-d75a-4f7f-94a6-55cdbc781de2" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981151 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55f6501-b4f7-4bad-814c-83c1b3c8b5cb" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981163 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8359d4-1d07-4d74-955a-25c5471f3817" containerName="registry-server" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981171 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1281b203-c580-4ebb-8c75-d6c8bafb3ca2" containerName="marketplace-operator" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.981876 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.983743 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 07 13:57:46 crc kubenswrapper[4717]: I1007 13:57:46.988322 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.093735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb9rs\" (UniqueName: \"kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.093806 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.093833 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.182580 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bgtcz"] Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.183925 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.191707 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.194940 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgtcz"] Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.195764 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb9rs\" (UniqueName: \"kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.195909 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.196032 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.196421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.196879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.220138 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb9rs\" (UniqueName: \"kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs\") pod \"certified-operators-8zvs9\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.296931 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-utilities\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.297201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptd2\" (UniqueName: \"kubernetes.io/projected/65715644-72b4-4b0e-895c-7d8a39fa8f87-kube-api-access-7ptd2\") pod \"redhat-marketplace-bgtcz\" (UID: 
\"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.297334 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-catalog-content\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.301706 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.398359 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-catalog-content\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.398763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-utilities\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.398794 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptd2\" (UniqueName: \"kubernetes.io/projected/65715644-72b4-4b0e-895c-7d8a39fa8f87-kube-api-access-7ptd2\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.399025 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-catalog-content\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.399170 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65715644-72b4-4b0e-895c-7d8a39fa8f87-utilities\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.415725 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptd2\" (UniqueName: \"kubernetes.io/projected/65715644-72b4-4b0e-895c-7d8a39fa8f87-kube-api-access-7ptd2\") pod \"redhat-marketplace-bgtcz\" (UID: \"65715644-72b4-4b0e-895c-7d8a39fa8f87\") " pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.486087 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.507299 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:47 crc kubenswrapper[4717]: I1007 13:57:47.707744 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bgtcz"] Oct 07 13:57:47 crc kubenswrapper[4717]: W1007 13:57:47.737723 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65715644_72b4_4b0e_895c_7d8a39fa8f87.slice/crio-06e3bfd574d16696f69a8c71bccf4349002ee7058f11c410b138ba7d273f895c WatchSource:0}: Error finding container 06e3bfd574d16696f69a8c71bccf4349002ee7058f11c410b138ba7d273f895c: Status 404 returned error can't find the container with id 06e3bfd574d16696f69a8c71bccf4349002ee7058f11c410b138ba7d273f895c Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.039538 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerID="03c92bbd648cbdd8c13fa3ee08821e6de09238f9ff7212825d5b29dc8053072f" exitCode=0 Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.039633 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerDied","Data":"03c92bbd648cbdd8c13fa3ee08821e6de09238f9ff7212825d5b29dc8053072f"} Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.039673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerStarted","Data":"d69ebe0250989b5ebcfd5cece5bed9279eedf89ed5fd3d9477bd0ca456b07d55"} Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.041500 4717 generic.go:334] "Generic (PLEG): container finished" podID="65715644-72b4-4b0e-895c-7d8a39fa8f87" containerID="7d1059ffd1b991e849aba006b3ff24b6a8651ec803bf5ccd5deec738af8290a9" exitCode=0 Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.041556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgtcz" event={"ID":"65715644-72b4-4b0e-895c-7d8a39fa8f87","Type":"ContainerDied","Data":"7d1059ffd1b991e849aba006b3ff24b6a8651ec803bf5ccd5deec738af8290a9"} Oct 07 13:57:48 crc kubenswrapper[4717]: I1007 13:57:48.041600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgtcz" event={"ID":"65715644-72b4-4b0e-895c-7d8a39fa8f87","Type":"ContainerStarted","Data":"06e3bfd574d16696f69a8c71bccf4349002ee7058f11c410b138ba7d273f895c"} Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.051895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgtcz" event={"ID":"65715644-72b4-4b0e-895c-7d8a39fa8f87","Type":"ContainerStarted","Data":"96cbc1f75f5bfbf44edb115567dd374677d935d2f8255fbdd431df811176dfba"} Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.379696 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.381572 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.383544 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.390543 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.419740 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.420303 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqbmw\" (UniqueName: \"kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.420468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.521048 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.521306 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqbmw\" (UniqueName: \"kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.521449 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.521643 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.521799 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " 
pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.540172 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqbmw\" (UniqueName: \"kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw\") pod \"redhat-operators-b9pl8\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.584132 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxh5w"] Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.588177 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.590302 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.591316 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxh5w"] Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.622355 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6sk8\" (UniqueName: \"kubernetes.io/projected/5a62f706-d9c9-407c-ba19-8556dfa331f4-kube-api-access-d6sk8\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.622427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-catalog-content\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.622458 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-utilities\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.710330 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.724717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6sk8\" (UniqueName: \"kubernetes.io/projected/5a62f706-d9c9-407c-ba19-8556dfa331f4-kube-api-access-d6sk8\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.725185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-catalog-content\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.725325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-utilities\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.725589 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-catalog-content\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.725703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a62f706-d9c9-407c-ba19-8556dfa331f4-utilities\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.740575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6sk8\" (UniqueName: \"kubernetes.io/projected/5a62f706-d9c9-407c-ba19-8556dfa331f4-kube-api-access-d6sk8\") pod \"community-operators-qxh5w\" (UID: \"5a62f706-d9c9-407c-ba19-8556dfa331f4\") " pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:49 crc kubenswrapper[4717]: I1007 13:57:49.882651 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 13:57:49 crc kubenswrapper[4717]: W1007 13:57:49.891812 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod873fb065_5d2d_48cc_b5f2_2d65764ec045.slice/crio-7aa66ab5a644fea132e6b5fdecb190dd843be8dff6a3cc5353fdf8cecb2f28dd WatchSource:0}: Error finding container 7aa66ab5a644fea132e6b5fdecb190dd843be8dff6a3cc5353fdf8cecb2f28dd: Status 404 returned error can't find the container with id 7aa66ab5a644fea132e6b5fdecb190dd843be8dff6a3cc5353fdf8cecb2f28dd Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.007773 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.062270 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerID="c8a229e9b02ee80ad85d00a64eee03a1ba60aa17c2b68d8ffb39becb84e7460c" exitCode=0 Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.062358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerDied","Data":"c8a229e9b02ee80ad85d00a64eee03a1ba60aa17c2b68d8ffb39becb84e7460c"} Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.068936 4717 generic.go:334] "Generic (PLEG): container finished" podID="65715644-72b4-4b0e-895c-7d8a39fa8f87" containerID="96cbc1f75f5bfbf44edb115567dd374677d935d2f8255fbdd431df811176dfba" exitCode=0 Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.068986 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgtcz" event={"ID":"65715644-72b4-4b0e-895c-7d8a39fa8f87","Type":"ContainerDied","Data":"96cbc1f75f5bfbf44edb115567dd374677d935d2f8255fbdd431df811176dfba"} Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.071306 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerStarted","Data":"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68"} Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.071336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerStarted","Data":"7aa66ab5a644fea132e6b5fdecb190dd843be8dff6a3cc5353fdf8cecb2f28dd"} Oct 07 13:57:50 crc kubenswrapper[4717]: I1007 13:57:50.198213 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxh5w"] Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.094104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerStarted","Data":"9dbb21310dbe464a795cead66c96456169fab09665a9aa35322c48a4c783af15"} Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.096909 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bgtcz" event={"ID":"65715644-72b4-4b0e-895c-7d8a39fa8f87","Type":"ContainerStarted","Data":"301a05abc5d8a449fd0ed97e8ba083464ea093f1f68ac3cd921a294c89756585"} Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.098380 4717 generic.go:334] "Generic (PLEG): container finished" podID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerID="36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68" exitCode=0 Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.098449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerDied","Data":"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68"} Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.099952 4717 generic.go:334] "Generic (PLEG): container finished" podID="5a62f706-d9c9-407c-ba19-8556dfa331f4" containerID="eb87818945b80be6d93136dc6c501c1759b36cda377294ee28c414f091150e74" exitCode=0 Oct 07 13:57:51 crc 
kubenswrapper[4717]: I1007 13:57:51.100001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxh5w" event={"ID":"5a62f706-d9c9-407c-ba19-8556dfa331f4","Type":"ContainerDied","Data":"eb87818945b80be6d93136dc6c501c1759b36cda377294ee28c414f091150e74"} Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.100068 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxh5w" event={"ID":"5a62f706-d9c9-407c-ba19-8556dfa331f4","Type":"ContainerStarted","Data":"6a39bda6b812b331f89c56f8cca395f6b001c4a3f0a32e2658c511e5298bce6a"} Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.137452 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8zvs9" podStartSLOduration=2.632733284 podStartE2EDuration="5.137435026s" podCreationTimestamp="2025-10-07 13:57:46 +0000 UTC" firstStartedPulling="2025-10-07 13:57:48.040761827 +0000 UTC m=+249.868687619" lastFinishedPulling="2025-10-07 13:57:50.545463579 +0000 UTC m=+252.373389361" observedRunningTime="2025-10-07 13:57:51.115121208 +0000 UTC m=+252.943047010" watchObservedRunningTime="2025-10-07 13:57:51.137435026 +0000 UTC m=+252.965360818" Oct 07 13:57:51 crc kubenswrapper[4717]: I1007 13:57:51.158615 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bgtcz" podStartSLOduration=1.5457041820000001 podStartE2EDuration="4.158537811s" podCreationTimestamp="2025-10-07 13:57:47 +0000 UTC" firstStartedPulling="2025-10-07 13:57:48.045565987 +0000 UTC m=+249.873491779" lastFinishedPulling="2025-10-07 13:57:50.658399616 +0000 UTC m=+252.486325408" observedRunningTime="2025-10-07 13:57:51.139485622 +0000 UTC m=+252.967411424" watchObservedRunningTime="2025-10-07 13:57:51.158537811 +0000 UTC m=+252.986463603" Oct 07 13:57:52 crc kubenswrapper[4717]: I1007 13:57:52.106102 4717 generic.go:334] "Generic (PLEG): container finished" podID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerID="6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5" exitCode=0 Oct 07 13:57:52 crc kubenswrapper[4717]: I1007 13:57:52.106300 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerDied","Data":"6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5"} Oct 07 13:57:54 crc kubenswrapper[4717]: I1007 13:57:54.119750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerStarted","Data":"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e"} Oct 07 13:57:54 crc kubenswrapper[4717]: I1007 13:57:54.122839 4717 generic.go:334] "Generic (PLEG): container finished" podID="5a62f706-d9c9-407c-ba19-8556dfa331f4" containerID="b6e618ed8865a9047e5518fa6f684872abc299fdad980a012af4a60877eb4f83" exitCode=0 Oct 07 13:57:54 crc kubenswrapper[4717]: I1007 13:57:54.122898 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxh5w" event={"ID":"5a62f706-d9c9-407c-ba19-8556dfa331f4","Type":"ContainerDied","Data":"b6e618ed8865a9047e5518fa6f684872abc299fdad980a012af4a60877eb4f83"} Oct 07 13:57:54 crc kubenswrapper[4717]: I1007 13:57:54.137738 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b9pl8" 
podStartSLOduration=2.676652766 podStartE2EDuration="5.137718591s" podCreationTimestamp="2025-10-07 13:57:49 +0000 UTC" firstStartedPulling="2025-10-07 13:57:50.072471895 +0000 UTC m=+251.900397697" lastFinishedPulling="2025-10-07 13:57:52.53353773 +0000 UTC m=+254.361463522" observedRunningTime="2025-10-07 13:57:54.135352196 +0000 UTC m=+255.963277988" watchObservedRunningTime="2025-10-07 13:57:54.137718591 +0000 UTC m=+255.965644383" Oct 07 13:57:55 crc kubenswrapper[4717]: I1007 13:57:55.133081 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxh5w" event={"ID":"5a62f706-d9c9-407c-ba19-8556dfa331f4","Type":"ContainerStarted","Data":"8eb99bc9e651dba9ab5d10833439ae459e21ad6d9d48a016a14c893c152fe85f"} Oct 07 13:57:55 crc kubenswrapper[4717]: I1007 13:57:55.148912 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxh5w" podStartSLOduration=2.64410563 podStartE2EDuration="6.148894268s" podCreationTimestamp="2025-10-07 13:57:49 +0000 UTC" firstStartedPulling="2025-10-07 13:57:51.101628201 +0000 UTC m=+252.929553993" lastFinishedPulling="2025-10-07 13:57:54.606416839 +0000 UTC m=+256.434342631" observedRunningTime="2025-10-07 13:57:55.148578099 +0000 UTC m=+256.976503891" watchObservedRunningTime="2025-10-07 13:57:55.148894268 +0000 UTC m=+256.976820060" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.302469 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.302783 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.342885 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.508451 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.508904 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:57 crc kubenswrapper[4717]: I1007 13:57:57.548106 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:58 crc kubenswrapper[4717]: I1007 13:57:58.181928 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 13:57:58 crc kubenswrapper[4717]: I1007 13:57:58.182424 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bgtcz" Oct 07 13:57:59 crc kubenswrapper[4717]: I1007 13:57:59.710866 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:59 crc kubenswrapper[4717]: I1007 13:57:59.710930 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:57:59 crc kubenswrapper[4717]: I1007 13:57:59.751724 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:58:00 crc kubenswrapper[4717]: I1007 13:58:00.008226 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:58:00 crc kubenswrapper[4717]: I1007 13:58:00.008280 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:58:00 crc kubenswrapper[4717]: I1007 13:58:00.050849 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:58:00 crc kubenswrapper[4717]: I1007 13:58:00.194061 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxh5w" Oct 07 13:58:00 crc kubenswrapper[4717]: I1007 13:58:00.199734 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 13:59:31 crc kubenswrapper[4717]: I1007 13:59:31.609596 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 13:59:31 crc kubenswrapper[4717]: I1007 13:59:31.611219 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.143900 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk"] Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.145344 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.147419 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.148196 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.154785 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk"] Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.257717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrd6q\" (UniqueName: \"kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.258474 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.258526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.360019 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrd6q\" (UniqueName: \"kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.360289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.360376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.361308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume\") pod 
\"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.367562 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.380814 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrd6q\" (UniqueName: \"kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q\") pod \"collect-profiles-29330760-525fk\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.460299 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.615604 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk"] Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.807967 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" event={"ID":"3120106e-abb9-496f-ba83-81806347c89c","Type":"ContainerStarted","Data":"4029f85a0293606220d499720dfc611e8bf3fd6581f93e370c295104a36a49b5"} Oct 07 14:00:00 crc kubenswrapper[4717]: I1007 14:00:00.808318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" event={"ID":"3120106e-abb9-496f-ba83-81806347c89c","Type":"ContainerStarted","Data":"2dbf1735e794621453a4870ffaed3e7d3ce70f2884438ed916d6ccfa66d13943"} Oct 07 14:00:01 crc kubenswrapper[4717]: I1007 14:00:01.610298 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:00:01 crc kubenswrapper[4717]: I1007 14:00:01.610359 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:01 crc kubenswrapper[4717]: I1007 14:00:01.813483 4717 generic.go:334] "Generic (PLEG): container finished" podID="3120106e-abb9-496f-ba83-81806347c89c" containerID="4029f85a0293606220d499720dfc611e8bf3fd6581f93e370c295104a36a49b5" exitCode=0 Oct 07 14:00:01 crc kubenswrapper[4717]: I1007 14:00:01.813675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" event={"ID":"3120106e-abb9-496f-ba83-81806347c89c","Type":"ContainerDied","Data":"4029f85a0293606220d499720dfc611e8bf3fd6581f93e370c295104a36a49b5"} Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.003587 4717 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.191841 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume\") pod \"3120106e-abb9-496f-ba83-81806347c89c\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.191924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume\") pod \"3120106e-abb9-496f-ba83-81806347c89c\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.191959 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrd6q\" (UniqueName: \"kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q\") pod \"3120106e-abb9-496f-ba83-81806347c89c\" (UID: \"3120106e-abb9-496f-ba83-81806347c89c\") " Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.192637 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume" (OuterVolumeSpecName: "config-volume") pod "3120106e-abb9-496f-ba83-81806347c89c" (UID: "3120106e-abb9-496f-ba83-81806347c89c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.196535 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q" (OuterVolumeSpecName: "kube-api-access-xrd6q") pod "3120106e-abb9-496f-ba83-81806347c89c" (UID: "3120106e-abb9-496f-ba83-81806347c89c"). InnerVolumeSpecName "kube-api-access-xrd6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.196546 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3120106e-abb9-496f-ba83-81806347c89c" (UID: "3120106e-abb9-496f-ba83-81806347c89c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.293499 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3120106e-abb9-496f-ba83-81806347c89c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.293548 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3120106e-abb9-496f-ba83-81806347c89c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.293560 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrd6q\" (UniqueName: \"kubernetes.io/projected/3120106e-abb9-496f-ba83-81806347c89c-kube-api-access-xrd6q\") on node \"crc\" DevicePath \"\"" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.825085 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" event={"ID":"3120106e-abb9-496f-ba83-81806347c89c","Type":"ContainerDied","Data":"2dbf1735e794621453a4870ffaed3e7d3ce70f2884438ed916d6ccfa66d13943"} Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.825140 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dbf1735e794621453a4870ffaed3e7d3ce70f2884438ed916d6ccfa66d13943" Oct 07 14:00:03 crc kubenswrapper[4717]: I1007 14:00:03.825158 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.199261 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-q4qv4"] Oct 07 14:00:14 crc kubenswrapper[4717]: E1007 14:00:14.199742 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3120106e-abb9-496f-ba83-81806347c89c" containerName="collect-profiles" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.199757 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3120106e-abb9-496f-ba83-81806347c89c" containerName="collect-profiles" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.199850 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3120106e-abb9-496f-ba83-81806347c89c" containerName="collect-profiles" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.200208 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-certificates\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233526 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-bound-sa-token\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233665 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2vmv\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-kube-api-access-x2vmv\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.233843 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-tls\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.234000 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-trusted-ca\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.273479 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-q4qv4"] Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.311214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-certificates\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334660 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-bound-sa-token\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334689 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2vmv\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-kube-api-access-x2vmv\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334721 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-tls\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334757 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-trusted-ca\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334780 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.334802 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.335810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-certificates\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.336727 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-ca-trust-extracted\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.337758 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-trusted-ca\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.342965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-installation-pull-secrets\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.347497 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-registry-tls\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.350069 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-bound-sa-token\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.352514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2vmv\" (UniqueName: \"kubernetes.io/projected/49022e96-73f8-4690-a9b0-62a6cfeaa5a0-kube-api-access-x2vmv\") pod \"image-registry-66df7c8f76-q4qv4\" (UID: \"49022e96-73f8-4690-a9b0-62a6cfeaa5a0\") " pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.515134 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.702304 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-q4qv4"] Oct 07 14:00:14 crc kubenswrapper[4717]: I1007 14:00:14.878489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" event={"ID":"49022e96-73f8-4690-a9b0-62a6cfeaa5a0","Type":"ContainerStarted","Data":"9d1b94474ff172f92c79eb7e439351549e3aeab7a5f7b1a9a72b5e658ee08d78"} Oct 07 14:00:15 crc kubenswrapper[4717]: I1007 14:00:15.886629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" event={"ID":"49022e96-73f8-4690-a9b0-62a6cfeaa5a0","Type":"ContainerStarted","Data":"c34c695b2ddabb72ae6a2c82aa0754635e8744318fd959e6b5b017b3eea0fe20"} Oct 07 14:00:15 crc kubenswrapper[4717]: I1007 14:00:15.886853 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:15 crc kubenswrapper[4717]: I1007 14:00:15.911504 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" podStartSLOduration=1.911475288 podStartE2EDuration="1.911475288s" podCreationTimestamp="2025-10-07 14:00:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:00:15.903135944 +0000 UTC m=+397.731061746" watchObservedRunningTime="2025-10-07 14:00:15.911475288 +0000 UTC m=+397.739401120" Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.609640 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.610140 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.610188 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.610823 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.610880 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a" gracePeriod=600 Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 
14:00:31.974855 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a" exitCode=0 Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.974893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a"} Oct 07 14:00:31 crc kubenswrapper[4717]: I1007 14:00:31.975207 4717 scope.go:117] "RemoveContainer" containerID="6d4da9ea7ecc3690fc734334f50c0d95b03fc3a6794cf2ee96686aa267405e89" Oct 07 14:00:32 crc kubenswrapper[4717]: I1007 14:00:32.983439 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a"} Oct 07 14:00:34 crc kubenswrapper[4717]: I1007 14:00:34.521870 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-q4qv4" Oct 07 14:00:34 crc kubenswrapper[4717]: I1007 14:00:34.585643 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 14:00:59 crc kubenswrapper[4717]: I1007 14:00:59.636897 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" podUID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" containerName="registry" containerID="cri-o://8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016" gracePeriod=30 Oct 07 14:00:59 crc kubenswrapper[4717]: I1007 14:00:59.955614 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.032761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dldt5\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.032809 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.032851 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.032892 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.032928 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.033085 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.033144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.033196 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted\") pod \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\" (UID: \"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642\") " Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.037691 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.038316 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.039741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.040391 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.041460 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.041887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5" (OuterVolumeSpecName: "kube-api-access-dldt5") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "kube-api-access-dldt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.049580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.055615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" (UID: "f3c6c7fc-eddf-402e-b2ce-96ae0cf04642"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.121961 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" containerID="8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016" exitCode=0 Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.122027 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" event={"ID":"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642","Type":"ContainerDied","Data":"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016"} Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.122052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" event={"ID":"f3c6c7fc-eddf-402e-b2ce-96ae0cf04642","Type":"ContainerDied","Data":"8127077191be677d3357eda3f1db046d124cc93cf656bea190c3ea6d4c7d19f8"} Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.122068 4717 scope.go:117] "RemoveContainer" containerID="8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.122164 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncngn" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134314 4717 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134350 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134362 4717 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134374 4717 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134385 4717 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134398 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dldt5\" (UniqueName: \"kubernetes.io/projected/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-kube-api-access-dldt5\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.134408 4717 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.137288 4717 scope.go:117] "RemoveContainer" containerID="8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016" Oct 07 14:01:00 crc kubenswrapper[4717]: E1007 14:01:00.137700 4717 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016\": container with ID starting with 8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016 not found: ID does not exist" containerID="8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.137744 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016"} err="failed to get container status \"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016\": rpc error: code = NotFound desc = could not find container \"8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016\": container with ID starting with 8a3000f2d5ec25d202a17392e00ba49ca5f71581ecf953451a5359222403e016 not found: ID does not exist" Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.151130 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.156475 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncngn"] Oct 07 14:01:00 crc kubenswrapper[4717]: I1007 14:01:00.876306 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" path="/var/lib/kubelet/pods/f3c6c7fc-eddf-402e-b2ce-96ae0cf04642/volumes" Oct 07 14:02:39 crc kubenswrapper[4717]: I1007 14:02:39.030261 4717 scope.go:117] "RemoveContainer" containerID="92d51aab409ddba4e138dde39686d8197bb79c6df675753558048fc0109cfc7c" Oct 07 14:03:01 crc kubenswrapper[4717]: I1007 14:03:01.609428 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:03:01 crc kubenswrapper[4717]: I1007 14:03:01.610022 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:03:31 crc kubenswrapper[4717]: I1007 14:03:31.609637 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:03:31 crc kubenswrapper[4717]: I1007 14:03:31.610312 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.538628 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kg4r6"] Oct 07 14:03:38 crc kubenswrapper[4717]: E1007 14:03:38.539200 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" containerName="registry" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.539220 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" containerName="registry" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.539349 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c6c7fc-eddf-402e-b2ce-96ae0cf04642" containerName="registry" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.539769 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.542168 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.542224 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.542648 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zls9g" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.552204 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cjl9f"] Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.554460 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cjl9f" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.561391 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kg4r6"] Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.564238 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hm725" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.595637 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z4t9q"] Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.602455 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.605985 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nf696" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.606703 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z4t9q"] Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.613678 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cjl9f"] Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.717993 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvmlp\" (UniqueName: \"kubernetes.io/projected/3c320f2d-2ff6-4ca8-824a-c866f83684f5-kube-api-access-lvmlp\") pod \"cert-manager-cainjector-7f985d654d-kg4r6\" (UID: \"3c320f2d-2ff6-4ca8-824a-c866f83684f5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.718070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4s9r\" (UniqueName: \"kubernetes.io/projected/2865143d-019f-4b4d-950a-c346856cfb7a-kube-api-access-c4s9r\") pod \"cert-manager-5b446d88c5-cjl9f\" (UID: \"2865143d-019f-4b4d-950a-c346856cfb7a\") " pod="cert-manager/cert-manager-5b446d88c5-cjl9f" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.718276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxmmn\" (UniqueName: \"kubernetes.io/projected/a9fab481-f23d-4515-a94d-15fac56b0032-kube-api-access-dxmmn\") pod \"cert-manager-webhook-5655c58dd6-z4t9q\" (UID: \"a9fab481-f23d-4515-a94d-15fac56b0032\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.819394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4s9r\" (UniqueName: \"kubernetes.io/projected/2865143d-019f-4b4d-950a-c346856cfb7a-kube-api-access-c4s9r\") pod \"cert-manager-5b446d88c5-cjl9f\" (UID: \"2865143d-019f-4b4d-950a-c346856cfb7a\") " pod="cert-manager/cert-manager-5b446d88c5-cjl9f" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.819642 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxmmn\" (UniqueName: \"kubernetes.io/projected/a9fab481-f23d-4515-a94d-15fac56b0032-kube-api-access-dxmmn\") pod \"cert-manager-webhook-5655c58dd6-z4t9q\" (UID: \"a9fab481-f23d-4515-a94d-15fac56b0032\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.819708 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvmlp\" (UniqueName: \"kubernetes.io/projected/3c320f2d-2ff6-4ca8-824a-c866f83684f5-kube-api-access-lvmlp\") pod \"cert-manager-cainjector-7f985d654d-kg4r6\" (UID: \"3c320f2d-2ff6-4ca8-824a-c866f83684f5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.833755 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.841103 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 07 14:03:38 crc 
kubenswrapper[4717]: I1007 14:03:38.847435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvmlp\" (UniqueName: \"kubernetes.io/projected/3c320f2d-2ff6-4ca8-824a-c866f83684f5-kube-api-access-lvmlp\") pod \"cert-manager-cainjector-7f985d654d-kg4r6\" (UID: \"3c320f2d-2ff6-4ca8-824a-c866f83684f5\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.852301 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxmmn\" (UniqueName: \"kubernetes.io/projected/a9fab481-f23d-4515-a94d-15fac56b0032-kube-api-access-dxmmn\") pod \"cert-manager-webhook-5655c58dd6-z4t9q\" (UID: \"a9fab481-f23d-4515-a94d-15fac56b0032\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.853579 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4s9r\" (UniqueName: \"kubernetes.io/projected/2865143d-019f-4b4d-950a-c346856cfb7a-kube-api-access-c4s9r\") pod \"cert-manager-5b446d88c5-cjl9f\" (UID: \"2865143d-019f-4b4d-950a-c346856cfb7a\") " pod="cert-manager/cert-manager-5b446d88c5-cjl9f" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.873317 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zls9g" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.882054 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.892133 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-hm725" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.900513 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-cjl9f" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.926762 4717 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nf696" Oct 07 14:03:38 crc kubenswrapper[4717]: I1007 14:03:38.936040 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.069915 4717 scope.go:117] "RemoveContainer" containerID="de1d6211843cc2177133c5563a3cdeb5c73b0729a5b0e961756b2c2595b0f18e" Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.089325 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-cjl9f"] Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.141234 4717 scope.go:117] "RemoveContainer" containerID="f4df93a45316e3ceb5d6dc9e912f587989b17d2004b354027d750a26cc89e0a6" Oct 07 14:03:39 crc kubenswrapper[4717]: W1007 14:03:39.147919 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2865143d_019f_4b4d_950a_c346856cfb7a.slice/crio-36572d0372e5ccd1ca22dd2c0ab8af5367f82bf52c9024298307ba663665854c WatchSource:0}: Error finding container 36572d0372e5ccd1ca22dd2c0ab8af5367f82bf52c9024298307ba663665854c: Status 404 returned error can't find the container with id 36572d0372e5ccd1ca22dd2c0ab8af5367f82bf52c9024298307ba663665854c Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.151224 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.203861 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-z4t9q"] Oct 07 14:03:39 crc kubenswrapper[4717]: W1007 14:03:39.209575 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9fab481_f23d_4515_a94d_15fac56b0032.slice/crio-8c0900568fa2fa8761d05d619c7139951b433ef0d0f8685a24915cbb085548c1 WatchSource:0}: Error finding container 8c0900568fa2fa8761d05d619c7139951b433ef0d0f8685a24915cbb085548c1: Status 404 returned error can't find the container with id 8c0900568fa2fa8761d05d619c7139951b433ef0d0f8685a24915cbb085548c1 Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.317821 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-kg4r6"] Oct 07 14:03:39 crc kubenswrapper[4717]: W1007 14:03:39.323063 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c320f2d_2ff6_4ca8_824a_c866f83684f5.slice/crio-102598a80eff7f99801447ae3a8a2097948d8a193c072981a10692e7cae9c3b4 WatchSource:0}: Error finding container 102598a80eff7f99801447ae3a8a2097948d8a193c072981a10692e7cae9c3b4: Status 404 returned error can't find the container with id 102598a80eff7f99801447ae3a8a2097948d8a193c072981a10692e7cae9c3b4 Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.981057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" event={"ID":"a9fab481-f23d-4515-a94d-15fac56b0032","Type":"ContainerStarted","Data":"8c0900568fa2fa8761d05d619c7139951b433ef0d0f8685a24915cbb085548c1"} Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.982606 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cjl9f" event={"ID":"2865143d-019f-4b4d-950a-c346856cfb7a","Type":"ContainerStarted","Data":"36572d0372e5ccd1ca22dd2c0ab8af5367f82bf52c9024298307ba663665854c"} Oct 07 14:03:39 crc kubenswrapper[4717]: I1007 14:03:39.984196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" event={"ID":"3c320f2d-2ff6-4ca8-824a-c866f83684f5","Type":"ContainerStarted","Data":"102598a80eff7f99801447ae3a8a2097948d8a193c072981a10692e7cae9c3b4"} Oct 07 14:03:42 crc kubenswrapper[4717]: I1007 14:03:42.998941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" event={"ID":"a9fab481-f23d-4515-a94d-15fac56b0032","Type":"ContainerStarted","Data":"8ade449cc11925e8f687d84a8763db5e374488d7172c519f0b28292f1790dda6"} Oct 07 14:03:42 crc kubenswrapper[4717]: I1007 14:03:42.999428 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:43 crc kubenswrapper[4717]: I1007 14:03:43.001551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-cjl9f" event={"ID":"2865143d-019f-4b4d-950a-c346856cfb7a","Type":"ContainerStarted","Data":"ec7e52abe791090b367c28521a11c3e9c39621ec027f6999954d706b05a112e8"} Oct 07 14:03:43 crc kubenswrapper[4717]: I1007 14:03:43.004322 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" event={"ID":"3c320f2d-2ff6-4ca8-824a-c866f83684f5","Type":"ContainerStarted","Data":"1fb8e76de7cf02cbe560526e2c22e75a57b35e666c8990e086dd5398146222fa"} Oct 07 14:03:43 crc kubenswrapper[4717]: I1007 14:03:43.027120 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-kg4r6" podStartSLOduration=1.8102516039999998 podStartE2EDuration="5.027104623s" podCreationTimestamp="2025-10-07 14:03:38 +0000 UTC" firstStartedPulling="2025-10-07 14:03:39.325417817 +0000 UTC m=+601.153343609" lastFinishedPulling="2025-10-07 14:03:42.542270836 +0000 UTC m=+604.370196628" observedRunningTime="2025-10-07 14:03:43.026190587 +0000 UTC m=+604.854116379" watchObservedRunningTime="2025-10-07 14:03:43.027104623 +0000 UTC m=+604.855030415" Oct 07 14:03:43 crc kubenswrapper[4717]: I1007 14:03:43.028801 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" podStartSLOduration=1.641936477 podStartE2EDuration="5.02879029s" podCreationTimestamp="2025-10-07 14:03:38 +0000 UTC" firstStartedPulling="2025-10-07 14:03:39.212489912 +0000 UTC m=+601.040415704" lastFinishedPulling="2025-10-07 14:03:42.599343725 +0000 UTC m=+604.427269517" observedRunningTime="2025-10-07 14:03:43.014288674 +0000 UTC m=+604.842214466" watchObservedRunningTime="2025-10-07 14:03:43.02879029 +0000 UTC m=+604.856716082" Oct 07 14:03:43 crc kubenswrapper[4717]: I1007 14:03:43.040343 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-cjl9f" podStartSLOduration=1.6482335529999999 podStartE2EDuration="5.040301243s" podCreationTimestamp="2025-10-07 14:03:38 +0000 UTC" firstStartedPulling="2025-10-07 14:03:39.150979728 +0000 UTC m=+600.978905520" lastFinishedPulling="2025-10-07 14:03:42.543047418 +0000 UTC m=+604.370973210" observedRunningTime="2025-10-07 14:03:43.038868373 +0000 UTC m=+604.866794165" watchObservedRunningTime="2025-10-07 14:03:43.040301243 +0000 UTC m=+604.868227035" Oct 07 14:03:48 crc kubenswrapper[4717]: I1007 14:03:48.938857 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-z4t9q" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.226697 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lx6tg"] Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227614 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-controller" containerID="cri-o://bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227646 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227734 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="northd" containerID="cri-o://ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227770 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-acl-logging" containerID="cri-o://7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227617 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="nbdb" containerID="cri-o://544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227867 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="sbdb" containerID="cri-o://65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.227914 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-node" containerID="cri-o://2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.257608 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" containerID="cri-o://0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" gracePeriod=30 Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.498913 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/3.log" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.501884 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovn-acl-logging/0.log" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 
14:03:49.502575 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovn-controller/0.log" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.503292 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559354 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559450 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559458 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559495 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559595 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.559644 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560262 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560388 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560483 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560658 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560725 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560792 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560836 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560896 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560957 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.560994 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: 
\"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561065 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561239 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561301 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kq2l\" (UniqueName: \"kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561340 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.561467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket\") pod \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\" (UID: \"d24a81c8-811e-41bf-ab5d-48590bc1e8df\") " Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.562983 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563063 4717 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563089 4717 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563137 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563280 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563324 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563313 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563395 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563424 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563560 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket" (OuterVolumeSpecName: "log-socket") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563692 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.563958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.564085 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log" (OuterVolumeSpecName: "node-log") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.564627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.564883 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.565172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash" (OuterVolumeSpecName: "host-slash") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.566283 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dkr8m"] Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.566907 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kubecfg-setup" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.566925 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kubecfg-setup" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.566946 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="northd" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.566954 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="northd" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.566964 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.566972 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.566989 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="nbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.566997 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="nbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569171 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569194 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569211 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-acl-logging" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569220 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-acl-logging" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569240 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-node" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569249 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-node" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569270 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569278 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569288 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" 
containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569296 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569311 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="sbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569320 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="sbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.569344 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569353 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569766 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569782 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-acl-logging" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569804 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569813 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569827 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovn-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569836 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-node" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569851 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="sbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569870 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="nbdb" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569879 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="northd" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569888 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.569903 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="kube-rbac-proxy-ovn-metrics" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.570290 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.570307 4717 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: E1007 14:03:49.570325 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.570340 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.570682 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerName="ovnkube-controller" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.575593 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.576423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l" (OuterVolumeSpecName: "kube-api-access-2kq2l") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "kube-api-access-2kq2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.578367 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.583523 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d24a81c8-811e-41bf-ab5d-48590bc1e8df" (UID: "d24a81c8-811e-41bf-ab5d-48590bc1e8df"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.663570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-log-socket\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.663613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.663633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-systemd-units\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.663786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-kubelet\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-var-lib-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664111 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzpw\" (UniqueName: \"kubernetes.io/projected/ec98cfb1-d92b-4421-8453-7dd595f8bb00-kube-api-access-kxzpw\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-bin\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664194 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovn-node-metrics-cert\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-netns\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664248 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-config\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664458 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-script-lib\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664513 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-slash\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-env-overrides\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664601 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-etc-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664629 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-systemd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664723 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-netd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-ovn\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664802 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-node-log\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664942 4717 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.664967 4717 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-node-log\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665048 4717 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d24a81c8-811e-41bf-ab5d-48590bc1e8df-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665061 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665073 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665084 4717 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665094 4717 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665104 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d24a81c8-811e-41bf-ab5d-48590bc1e8df-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665117 4717 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 
14:03:49.665131 4717 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665141 4717 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-slash\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665153 4717 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665166 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kq2l\" (UniqueName: \"kubernetes.io/projected/d24a81c8-811e-41bf-ab5d-48590bc1e8df-kube-api-access-2kq2l\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665177 4717 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665187 4717 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.665196 4717 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d24a81c8-811e-41bf-ab5d-48590bc1e8df-log-socket\") on node \"crc\" DevicePath \"\"" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766688 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-systemd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-netd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766824 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-ovn\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766862 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-node-log\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766871 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-systemd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766905 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-systemd-units\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766979 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-node-log\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767083 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-systemd-units\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-log-socket\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-ovn\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-netd\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.766996 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-log-socket\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767395 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-kubelet\") pod \"ovnkube-node-dkr8m\" (UID: 
\"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767457 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-var-lib-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-kubelet\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767587 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-var-lib-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767555 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzpw\" (UniqueName: \"kubernetes.io/projected/ec98cfb1-d92b-4421-8453-7dd595f8bb00-kube-api-access-kxzpw\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.767859 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-bin\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.768309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-cni-bin\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.768394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovn-node-metrics-cert\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.768510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-netns\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.768552 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-config\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.768611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-netns\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769114 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-script-lib\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769274 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-env-overrides\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769330 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-slash\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-run-ovn-kubernetes\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-etc-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769413 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-host-slash\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: 
I1007 14:03:49.769447 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769471 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-etc-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-config\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.769617 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ec98cfb1-d92b-4421-8453-7dd595f8bb00-run-openvswitch\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.770150 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-env-overrides\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.770228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovnkube-script-lib\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.772939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ec98cfb1-d92b-4421-8453-7dd595f8bb00-ovn-node-metrics-cert\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.783867 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzpw\" (UniqueName: \"kubernetes.io/projected/ec98cfb1-d92b-4421-8453-7dd595f8bb00-kube-api-access-kxzpw\") pod \"ovnkube-node-dkr8m\" (UID: \"ec98cfb1-d92b-4421-8453-7dd595f8bb00\") " pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:49 crc kubenswrapper[4717]: I1007 14:03:49.911817 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.044030 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/2.log" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.045336 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/1.log" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.045388 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf0d43cd-2fb1-490e-9de4-db923141bd43" containerID="9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991" exitCode=2 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.045471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerDied","Data":"9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.045527 4717 scope.go:117] "RemoveContainer" containerID="c6faf6a7bcb05188af3dc898a4e4ea0260acac9175d12a2f3f5b4104b512a0a0" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.046837 4717 scope.go:117] "RemoveContainer" containerID="9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.047345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-shhlh_openshift-multus(bf0d43cd-2fb1-490e-9de4-db923141bd43)\"" pod="openshift-multus/multus-shhlh" podUID="bf0d43cd-2fb1-490e-9de4-db923141bd43" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.047962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"c4d4f82621049f87f4c1bc1109045f2bcad5f0da250b8f7f49bc9c8dae5c00dc"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.056610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovnkube-controller/3.log" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.060302 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovn-acl-logging/0.log" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061161 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lx6tg_d24a81c8-811e-41bf-ab5d-48590bc1e8df/ovn-controller/0.log" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061633 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061657 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061666 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" 
containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061675 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061683 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061690 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" exitCode=0 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061697 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" exitCode=143 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061704 4717 generic.go:334] "Generic (PLEG): container finished" podID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" exitCode=143 Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061727 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061785 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061808 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061820 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 14:03:50 crc 
kubenswrapper[4717]: I1007 14:03:50.061831 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061838 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061844 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061849 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061857 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061862 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061868 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061873 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061878 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061885 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061893 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061899 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061904 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061910 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 14:03:50 crc 
kubenswrapper[4717]: I1007 14:03:50.061915 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061922 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061929 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061936 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061941 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061947 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061957 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061964 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061974 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061980 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061986 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.061991 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062228 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062236 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 14:03:50 crc 
kubenswrapper[4717]: I1007 14:03:50.062242 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062248 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062253 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062261 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" event={"ID":"d24a81c8-811e-41bf-ab5d-48590bc1e8df","Type":"ContainerDied","Data":"d80b15f81f211d766ee5da739eff9b66f4b4b1a380d50643eae287eb9f93f8e1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062270 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062277 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062283 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062290 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062296 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062302 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062307 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062313 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062310 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lx6tg" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.062318 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.063483 4717 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.083496 4717 scope.go:117] "RemoveContainer" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.105603 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lx6tg"] Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.108040 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lx6tg"] Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.156603 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.176607 4717 scope.go:117] "RemoveContainer" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.190427 4717 scope.go:117] "RemoveContainer" containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.205239 4717 scope.go:117] "RemoveContainer" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.232149 4717 scope.go:117] "RemoveContainer" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.248767 4717 scope.go:117] "RemoveContainer" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.269141 4717 scope.go:117] "RemoveContainer" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.285159 4717 scope.go:117] "RemoveContainer" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.298093 4717 scope.go:117] "RemoveContainer" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.325636 4717 scope.go:117] "RemoveContainer" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.326120 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": container with ID starting with 0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf not found: ID does not exist" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.326164 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} err="failed to get container status \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": rpc error: code = NotFound desc = could not find container \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": container with ID starting with 0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.326189 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.326562 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": container with ID starting with 3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30 not found: ID does not exist" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.326645 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} err="failed to get container status \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": rpc error: code = NotFound desc = could not find container \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": container with ID starting with 3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.326735 4717 scope.go:117] "RemoveContainer" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.327222 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": container with ID starting with 65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84 not found: ID does not exist" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.327262 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} err="failed to get container status \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": rpc error: code = NotFound desc = could not find container \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": container with ID starting with 65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.327291 4717 scope.go:117] "RemoveContainer" containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.327642 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": container with ID starting with 544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531 not found: ID does not exist" 
containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.327671 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} err="failed to get container status \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": rpc error: code = NotFound desc = could not find container \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": container with ID starting with 544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.327699 4717 scope.go:117] "RemoveContainer" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.327981 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": container with ID starting with ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b not found: ID does not exist" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328021 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} err="failed to get container status \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": rpc error: code = NotFound desc = could not find container \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": container with ID starting with ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328036 4717 scope.go:117] "RemoveContainer" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.328323 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": container with ID starting with 050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f not found: ID does not exist" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328379 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} err="failed to get container status \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": rpc error: code = NotFound desc = could not find container \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": container with ID starting with 050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328411 4717 scope.go:117] "RemoveContainer" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.328892 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": container with ID starting with 2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91 not found: ID does not exist" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328923 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} err="failed to get container status \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": rpc error: code = NotFound desc = could not find container \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": container with ID starting with 2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.328941 4717 scope.go:117] "RemoveContainer" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.329379 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": container with ID starting with 7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1 not found: ID does not exist" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.329445 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} err="failed to get container status \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": rpc error: code = NotFound desc = could not find container \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": container with ID starting with 7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.329495 4717 scope.go:117] "RemoveContainer" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: E1007 14:03:50.329844 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": container with ID starting with bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57 not found: ID does not exist" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.329883 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} err="failed to get container status \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": rpc error: code = NotFound desc = could not find container \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": container with ID starting with bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.329908 4717 scope.go:117] "RemoveContainer" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc 
kubenswrapper[4717]: E1007 14:03:50.330306 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": container with ID starting with 7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65 not found: ID does not exist" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.330348 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} err="failed to get container status \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": rpc error: code = NotFound desc = could not find container \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": container with ID starting with 7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.330381 4717 scope.go:117] "RemoveContainer" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.330764 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} err="failed to get container status \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": rpc error: code = NotFound desc = could not find container \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": container with ID starting with 0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.330820 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.331590 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} err="failed to get container status \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": rpc error: code = NotFound desc = could not find container \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": container with ID starting with 3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.331635 4717 scope.go:117] "RemoveContainer" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.331971 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} err="failed to get container status \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": rpc error: code = NotFound desc = could not find container \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": container with ID starting with 65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.332040 4717 scope.go:117] "RemoveContainer" containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc 
kubenswrapper[4717]: I1007 14:03:50.332380 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} err="failed to get container status \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": rpc error: code = NotFound desc = could not find container \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": container with ID starting with 544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.332429 4717 scope.go:117] "RemoveContainer" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.332848 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} err="failed to get container status \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": rpc error: code = NotFound desc = could not find container \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": container with ID starting with ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.332891 4717 scope.go:117] "RemoveContainer" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.333207 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} err="failed to get container status \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": rpc error: code = NotFound desc = could not find container \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": container with ID starting with 050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.333255 4717 scope.go:117] "RemoveContainer" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.333944 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} err="failed to get container status \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": rpc error: code = NotFound desc = could not find container \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": container with ID starting with 2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.333990 4717 scope.go:117] "RemoveContainer" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.334509 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} err="failed to get container status \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": rpc error: code = NotFound desc = could not find container \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": container with ID 
starting with 7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.334557 4717 scope.go:117] "RemoveContainer" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.335252 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} err="failed to get container status \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": rpc error: code = NotFound desc = could not find container \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": container with ID starting with bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.335298 4717 scope.go:117] "RemoveContainer" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.335650 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} err="failed to get container status \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": rpc error: code = NotFound desc = could not find container \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": container with ID starting with 7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.335690 4717 scope.go:117] "RemoveContainer" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.336083 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} err="failed to get container status \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": rpc error: code = NotFound desc = could not find container \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": container with ID starting with 0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.336132 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.337321 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} err="failed to get container status \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": rpc error: code = NotFound desc = could not find container \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": container with ID starting with 3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.337366 4717 scope.go:117] "RemoveContainer" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.338179 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} err="failed to get container status \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": rpc error: code = NotFound desc = could not find container \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": container with ID starting with 65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.338222 4717 scope.go:117] "RemoveContainer" containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.338681 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} err="failed to get container status \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": rpc error: code = NotFound desc = could not find container \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": container with ID starting with 544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.338728 4717 scope.go:117] "RemoveContainer" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.339474 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} err="failed to get container status \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": rpc error: code = NotFound desc = could not find container \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": container with ID starting with ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.339533 4717 scope.go:117] "RemoveContainer" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.339935 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} err="failed to get container status \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": rpc error: code = NotFound desc = could not find container \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": container with ID starting with 050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.339979 4717 scope.go:117] "RemoveContainer" containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.340547 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} err="failed to get container status \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": rpc error: code = NotFound desc = could not find container \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": container with ID starting with 2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91 not found: ID does not exist" Oct 
07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.340588 4717 scope.go:117] "RemoveContainer" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.341070 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} err="failed to get container status \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": rpc error: code = NotFound desc = could not find container \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": container with ID starting with 7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.341133 4717 scope.go:117] "RemoveContainer" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.341713 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} err="failed to get container status \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": rpc error: code = NotFound desc = could not find container \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": container with ID starting with bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.341754 4717 scope.go:117] "RemoveContainer" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342148 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} err="failed to get container status \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": rpc error: code = NotFound desc = could not find container \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": container with ID starting with 7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342188 4717 scope.go:117] "RemoveContainer" containerID="0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342525 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf"} err="failed to get container status \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": rpc error: code = NotFound desc = could not find container \"0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf\": container with ID starting with 0d29447b4081211708a2c0e497b38c99461e609a7a83cc06084e500f822be7bf not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342569 4717 scope.go:117] "RemoveContainer" containerID="3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342916 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30"} err="failed to get container status 
\"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": rpc error: code = NotFound desc = could not find container \"3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30\": container with ID starting with 3468ed891cd07a28b6dae5059e56b749d4c2ee8b5689d648e951f7235d147b30 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.342958 4717 scope.go:117] "RemoveContainer" containerID="65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.343422 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84"} err="failed to get container status \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": rpc error: code = NotFound desc = could not find container \"65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84\": container with ID starting with 65b1d161595e1af7c169f8f28f2cceb2b62de96aa6b24ee063d8c6079e723b84 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.343461 4717 scope.go:117] "RemoveContainer" containerID="544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.343933 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531"} err="failed to get container status \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": rpc error: code = NotFound desc = could not find container \"544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531\": container with ID starting with 544188a6da34e57c9086ff78e5d5bb346f6a71875476c0301e2fc9e3300c3531 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.343998 4717 scope.go:117] "RemoveContainer" containerID="ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.344523 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b"} err="failed to get container status \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": rpc error: code = NotFound desc = could not find container \"ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b\": container with ID starting with ce99aaeeab49ad9a6138944212326c08d1d968ca6a8b130eea23f68e5c0fa69b not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.344575 4717 scope.go:117] "RemoveContainer" containerID="050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.344963 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f"} err="failed to get container status \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": rpc error: code = NotFound desc = could not find container \"050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f\": container with ID starting with 050d027d7e4b113e9b23e0b0e2bc1a9d7bf6c8c694c7fbc75f30a14a52ee566f not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.344990 4717 scope.go:117] "RemoveContainer" 
containerID="2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.345354 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91"} err="failed to get container status \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": rpc error: code = NotFound desc = could not find container \"2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91\": container with ID starting with 2ab2dcf287e01974ba9840d54b1dab2651b7aa3788616173307840f5c729ae91 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.345380 4717 scope.go:117] "RemoveContainer" containerID="7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.345658 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1"} err="failed to get container status \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": rpc error: code = NotFound desc = could not find container \"7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1\": container with ID starting with 7105aa4922145d0ccc5cd0deb2f2245773db92c2c2442d5890ef92d223720ce1 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.345683 4717 scope.go:117] "RemoveContainer" containerID="bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.346067 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57"} err="failed to get container status \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": rpc error: code = NotFound desc = could not find container \"bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57\": container with ID starting with bdddee7623abd8a3a3c52dfe4141581f9dbdc440b142c907b49382e375a86f57 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.346135 4717 scope.go:117] "RemoveContainer" containerID="7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.346490 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65"} err="failed to get container status \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": rpc error: code = NotFound desc = could not find container \"7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65\": container with ID starting with 7b9de0d64c43063cda10628d11df31ba73d48daa3165f80cfcbf43df0fdaef65 not found: ID does not exist" Oct 07 14:03:50 crc kubenswrapper[4717]: I1007 14:03:50.880165 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24a81c8-811e-41bf-ab5d-48590bc1e8df" path="/var/lib/kubelet/pods/d24a81c8-811e-41bf-ab5d-48590bc1e8df/volumes" Oct 07 14:03:51 crc kubenswrapper[4717]: I1007 14:03:51.073199 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/2.log" Oct 07 14:03:51 crc kubenswrapper[4717]: I1007 14:03:51.074869 4717 generic.go:334] "Generic (PLEG): 
container finished" podID="ec98cfb1-d92b-4421-8453-7dd595f8bb00" containerID="6c9e5b3837c6833550ab87eeb7a08725ba2696bace9c8a8c24212b1075259336" exitCode=0 Oct 07 14:03:51 crc kubenswrapper[4717]: I1007 14:03:51.074939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerDied","Data":"6c9e5b3837c6833550ab87eeb7a08725ba2696bace9c8a8c24212b1075259336"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.082953 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"044ce1347305a18227ced1d68828dd9042beeeb412573ad09aa7f282d038e359"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.083320 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"b6d0f1f64a319ee42fc118d605d074b2b3a5b4dffba6e5f3386acb17163652a5"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.083333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"b0ad5379495f17da9818bbc66f2dbf87b3adcac76cbad65fd717f7a5b2a4d8ce"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.083347 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"021ca0a1c23e00cc32f824c604f615509c56b00168507d4cb6759b395f224882"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.083356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"c47a5f25bce1b8f1c06a9fc123a9479efc175051b7c001266d618735b8996efc"} Oct 07 14:03:52 crc kubenswrapper[4717]: I1007 14:03:52.083364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"90fa6741acaa19faced3198fbfbb0aa026d37fd4eaa5752ce51407b9b5fcc1b0"} Oct 07 14:03:54 crc kubenswrapper[4717]: I1007 14:03:54.094454 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"72de611764d962bf344b9d23347d56ae22a20453e7697acba0fdfa725835d76b"} Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.111335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" event={"ID":"ec98cfb1-d92b-4421-8453-7dd595f8bb00","Type":"ContainerStarted","Data":"1d116b90634f5b6eb446af04aab54ae7507dae45baeac0a561dba3454e8045f4"} Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.111868 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.111880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.111888 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.136303 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.136621 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:03:57 crc kubenswrapper[4717]: I1007 14:03:57.143144 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" podStartSLOduration=8.143123591 podStartE2EDuration="8.143123591s" podCreationTimestamp="2025-10-07 14:03:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:03:57.139033287 +0000 UTC m=+618.966959089" watchObservedRunningTime="2025-10-07 14:03:57.143123591 +0000 UTC m=+618.971049383" Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.609869 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.610178 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.610219 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.610694 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.610741 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a" gracePeriod=600 Oct 07 14:04:01 crc kubenswrapper[4717]: I1007 14:04:01.868067 4717 scope.go:117] "RemoveContainer" containerID="9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991" Oct 07 14:04:01 crc kubenswrapper[4717]: E1007 14:04:01.868483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-shhlh_openshift-multus(bf0d43cd-2fb1-490e-9de4-db923141bd43)\"" pod="openshift-multus/multus-shhlh" podUID="bf0d43cd-2fb1-490e-9de4-db923141bd43" Oct 07 14:04:02 crc kubenswrapper[4717]: I1007 14:04:02.137633 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" 
containerID="91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a" exitCode=0 Oct 07 14:04:02 crc kubenswrapper[4717]: I1007 14:04:02.137674 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a"} Oct 07 14:04:02 crc kubenswrapper[4717]: I1007 14:04:02.137699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3"} Oct 07 14:04:02 crc kubenswrapper[4717]: I1007 14:04:02.137715 4717 scope.go:117] "RemoveContainer" containerID="5c301f33aad4e554568debc52a0ab3e2302d1a336901ef90ffaaf63d15ce3a1a" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.297668 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.299088 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.301170 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.301197 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.301720 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-sgrxg" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.476981 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-run\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.477386 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjv8h\" (UniqueName: \"kubernetes.io/projected/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-kube-api-access-jjv8h\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.477415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-data\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.477447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-log\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.577998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-run\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578053 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jjv8h\" (UniqueName: \"kubernetes.io/projected/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-kube-api-access-jjv8h\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-data\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578466 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-log\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578783 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-data\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578821 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-log\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.578819 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-run\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.603621 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjv8h\" (UniqueName: \"kubernetes.io/projected/cd7e0d52-4d10-4898-8949-7f3dc9875fe9-kube-api-access-jjv8h\") pod \"ceph\" (UID: \"cd7e0d52-4d10-4898-8949-7f3dc9875fe9\") " pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: I1007 14:04:12.614075 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Oct 07 14:04:12 crc kubenswrapper[4717]: W1007 14:04:12.638864 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd7e0d52_4d10_4898_8949_7f3dc9875fe9.slice/crio-c29117e5d32133ea760915398176ce5a088c27f0e61eb5f9b56b5c8cff4fb2f0 WatchSource:0}: Error finding container c29117e5d32133ea760915398176ce5a088c27f0e61eb5f9b56b5c8cff4fb2f0: Status 404 returned error can't find the container with id c29117e5d32133ea760915398176ce5a088c27f0e61eb5f9b56b5c8cff4fb2f0 Oct 07 14:04:13 crc kubenswrapper[4717]: I1007 14:04:13.190907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"cd7e0d52-4d10-4898-8949-7f3dc9875fe9","Type":"ContainerStarted","Data":"c29117e5d32133ea760915398176ce5a088c27f0e61eb5f9b56b5c8cff4fb2f0"} Oct 07 14:04:14 crc kubenswrapper[4717]: I1007 14:04:14.868828 4717 scope.go:117] "RemoveContainer" containerID="9be8cb1b9e7ff40152ee18884bc2a3f032539835fe1d90eae801fbf1487d0991" Oct 07 14:04:17 crc kubenswrapper[4717]: I1007 14:04:17.211847 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-shhlh_bf0d43cd-2fb1-490e-9de4-db923141bd43/kube-multus/2.log" Oct 07 14:04:17 crc kubenswrapper[4717]: I1007 14:04:17.212184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-shhlh" event={"ID":"bf0d43cd-2fb1-490e-9de4-db923141bd43","Type":"ContainerStarted","Data":"9b78c93b281aca4829ed111a7ab9173f8803fa7546892b9f79d3d7757e58871e"} Oct 07 14:04:19 crc kubenswrapper[4717]: I1007 14:04:19.936935 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dkr8m" Oct 07 14:04:30 crc kubenswrapper[4717]: I1007 14:04:30.284152 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"cd7e0d52-4d10-4898-8949-7f3dc9875fe9","Type":"ContainerStarted","Data":"96d2e149c133fbc5f1583ab39a055e8f6e3f2ef420a611e8df251d95acb1e1fb"} Oct 07 14:04:30 crc kubenswrapper[4717]: I1007 14:04:30.307122 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.507114288 podStartE2EDuration="18.307107117s" podCreationTimestamp="2025-10-07 14:04:12 +0000 UTC" firstStartedPulling="2025-10-07 14:04:12.64085577 +0000 UTC m=+634.468781562" lastFinishedPulling="2025-10-07 14:04:29.440848599 +0000 UTC m=+651.268774391" observedRunningTime="2025-10-07 14:04:30.302644992 +0000 UTC m=+652.130570794" watchObservedRunningTime="2025-10-07 14:04:30.307107117 +0000 UTC m=+652.135032899" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.295389 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b"] Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.297037 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.299397 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.306397 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b"] Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.387228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.387282 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qs8w\" (UniqueName: \"kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.387373 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.488134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.488361 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qs8w\" (UniqueName: \"kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.488494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.488571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.489055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.505478 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qs8w\" (UniqueName: \"kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:52 crc kubenswrapper[4717]: I1007 14:05:52.654714 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:53 crc kubenswrapper[4717]: I1007 14:05:53.026706 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b"] Oct 07 14:05:53 crc kubenswrapper[4717]: I1007 14:05:53.741903 4717 generic.go:334] "Generic (PLEG): container finished" podID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerID="e680b578587a057bbef643128989768177dccd73d1fd6ee58e7fc062e91e0afe" exitCode=0 Oct 07 14:05:53 crc kubenswrapper[4717]: I1007 14:05:53.741973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" event={"ID":"d11c6b3d-b42a-4487-8149-216b5b9b2afd","Type":"ContainerDied","Data":"e680b578587a057bbef643128989768177dccd73d1fd6ee58e7fc062e91e0afe"} Oct 07 14:05:53 crc kubenswrapper[4717]: I1007 14:05:53.743135 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" event={"ID":"d11c6b3d-b42a-4487-8149-216b5b9b2afd","Type":"ContainerStarted","Data":"76aa3310f2fc931d8dfe68abd151f7c02b2bc67d99260b949ec0cebcd99418cd"} Oct 07 14:05:55 crc kubenswrapper[4717]: I1007 14:05:55.754378 4717 generic.go:334] "Generic (PLEG): container finished" podID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerID="469ce5fe37619ddf96699444fca079dff302e662ccf3b5b0b49cbf6b98630ccf" exitCode=0 Oct 07 14:05:55 crc kubenswrapper[4717]: I1007 14:05:55.754488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" event={"ID":"d11c6b3d-b42a-4487-8149-216b5b9b2afd","Type":"ContainerDied","Data":"469ce5fe37619ddf96699444fca079dff302e662ccf3b5b0b49cbf6b98630ccf"} Oct 07 14:05:56 crc kubenswrapper[4717]: I1007 14:05:56.767055 4717 generic.go:334] "Generic (PLEG): container finished" podID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerID="e2deb5a5312e3686819fc26a78af1a8f6afb1a610f3d432207b266d2ff86433f" exitCode=0 Oct 07 14:05:56 crc kubenswrapper[4717]: I1007 
14:05:56.767110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" event={"ID":"d11c6b3d-b42a-4487-8149-216b5b9b2afd","Type":"ContainerDied","Data":"e2deb5a5312e3686819fc26a78af1a8f6afb1a610f3d432207b266d2ff86433f"} Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.027219 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.162188 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util\") pod \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.162275 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qs8w\" (UniqueName: \"kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w\") pod \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.162485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle\") pod \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\" (UID: \"d11c6b3d-b42a-4487-8149-216b5b9b2afd\") " Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.163100 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle" (OuterVolumeSpecName: "bundle") pod "d11c6b3d-b42a-4487-8149-216b5b9b2afd" (UID: "d11c6b3d-b42a-4487-8149-216b5b9b2afd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.168346 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w" (OuterVolumeSpecName: "kube-api-access-9qs8w") pod "d11c6b3d-b42a-4487-8149-216b5b9b2afd" (UID: "d11c6b3d-b42a-4487-8149-216b5b9b2afd"). InnerVolumeSpecName "kube-api-access-9qs8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.192697 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util" (OuterVolumeSpecName: "util") pod "d11c6b3d-b42a-4487-8149-216b5b9b2afd" (UID: "d11c6b3d-b42a-4487-8149-216b5b9b2afd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.263490 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.263528 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d11c6b3d-b42a-4487-8149-216b5b9b2afd-util\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.263541 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qs8w\" (UniqueName: \"kubernetes.io/projected/d11c6b3d-b42a-4487-8149-216b5b9b2afd-kube-api-access-9qs8w\") on node \"crc\" DevicePath \"\"" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.783574 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" event={"ID":"d11c6b3d-b42a-4487-8149-216b5b9b2afd","Type":"ContainerDied","Data":"76aa3310f2fc931d8dfe68abd151f7c02b2bc67d99260b949ec0cebcd99418cd"} Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.783925 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76aa3310f2fc931d8dfe68abd151f7c02b2bc67d99260b949ec0cebcd99418cd" Oct 07 14:05:58 crc kubenswrapper[4717]: I1007 14:05:58.784082 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.111419 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pdn48"] Oct 07 14:06:00 crc kubenswrapper[4717]: E1007 14:06:00.111631 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="extract" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.111642 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="extract" Oct 07 14:06:00 crc kubenswrapper[4717]: E1007 14:06:00.111652 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="pull" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.111658 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="pull" Oct 07 14:06:00 crc kubenswrapper[4717]: E1007 14:06:00.111676 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="util" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.111682 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="util" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.111773 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11c6b3d-b42a-4487-8149-216b5b9b2afd" containerName="extract" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.112170 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.113977 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.114547 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-kkdw4" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.116644 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.129285 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pdn48"] Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.294338 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdm6x\" (UniqueName: \"kubernetes.io/projected/840561ca-9862-4a00-af4e-9e870d51efa4-kube-api-access-fdm6x\") pod \"nmstate-operator-858ddd8f98-pdn48\" (UID: \"840561ca-9862-4a00-af4e-9e870d51efa4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.396128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdm6x\" (UniqueName: \"kubernetes.io/projected/840561ca-9862-4a00-af4e-9e870d51efa4-kube-api-access-fdm6x\") pod \"nmstate-operator-858ddd8f98-pdn48\" (UID: \"840561ca-9862-4a00-af4e-9e870d51efa4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.417443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdm6x\" (UniqueName: \"kubernetes.io/projected/840561ca-9862-4a00-af4e-9e870d51efa4-kube-api-access-fdm6x\") pod \"nmstate-operator-858ddd8f98-pdn48\" (UID: \"840561ca-9862-4a00-af4e-9e870d51efa4\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.424796 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.610709 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-pdn48"] Oct 07 14:06:00 crc kubenswrapper[4717]: I1007 14:06:00.793693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" event={"ID":"840561ca-9862-4a00-af4e-9e870d51efa4","Type":"ContainerStarted","Data":"1275be3b75033614f81a2fe119bf263156116ccb4307ef7ddc4fb72c54e7f609"} Oct 07 14:06:01 crc kubenswrapper[4717]: I1007 14:06:01.609408 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:06:01 crc kubenswrapper[4717]: I1007 14:06:01.609467 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.301187 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.409377 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.409645 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerName="route-controller-manager" containerID="cri-o://21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f" gracePeriod=30 Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.775045 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.807922 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" event={"ID":"840561ca-9862-4a00-af4e-9e870d51efa4","Type":"ContainerStarted","Data":"273e57093d69aa2852d8fcb33206fd244e3a48765d1bad02e22a7e1ee4c4310c"} Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.809404 4717 generic.go:334] "Generic (PLEG): container finished" podID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerID="21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f" exitCode=0 Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.809626 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerName="controller-manager" containerID="cri-o://7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf" gracePeriod=30 Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.810054 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.810267 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" event={"ID":"7547f514-4e34-4e57-ac67-a4d57b1e7e18","Type":"ContainerDied","Data":"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f"} Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.810327 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq" event={"ID":"7547f514-4e34-4e57-ac67-a4d57b1e7e18","Type":"ContainerDied","Data":"8c779563afd65e70fcab3772ed3f72bb7fee4186aa7fc6d2cb391c0cc752a807"} Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.810350 4717 scope.go:117] "RemoveContainer" containerID="21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.824640 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-pdn48" podStartSLOduration=1.724080321 podStartE2EDuration="3.824621207s" podCreationTimestamp="2025-10-07 14:06:00 +0000 UTC" firstStartedPulling="2025-10-07 14:06:00.62617902 +0000 UTC m=+742.454104822" lastFinishedPulling="2025-10-07 14:06:02.726719916 +0000 UTC m=+744.554645708" observedRunningTime="2025-10-07 14:06:03.82113792 +0000 UTC m=+745.649063712" watchObservedRunningTime="2025-10-07 14:06:03.824621207 +0000 UTC m=+745.652546999" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.831183 4717 scope.go:117] "RemoveContainer" containerID="21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f" Oct 07 14:06:03 crc kubenswrapper[4717]: E1007 14:06:03.831611 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f\": container with ID starting with 21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f not found: ID does not exist" containerID="21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.831647 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f"} err="failed to get container status \"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f\": rpc error: code = NotFound desc = could not find container \"21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f\": container with ID starting with 21e99c92a241c26277703561ed50cc2b9f981bf6ae1a0457ec9c21941c10840f not found: ID does not exist" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.936485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws7bq\" (UniqueName: \"kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq\") pod \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.936559 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config\") pod \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\" (UID: 
\"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.936722 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca\") pod \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.936778 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert\") pod \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\" (UID: \"7547f514-4e34-4e57-ac67-a4d57b1e7e18\") " Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.937203 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config" (OuterVolumeSpecName: "config") pod "7547f514-4e34-4e57-ac67-a4d57b1e7e18" (UID: "7547f514-4e34-4e57-ac67-a4d57b1e7e18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.937409 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca" (OuterVolumeSpecName: "client-ca") pod "7547f514-4e34-4e57-ac67-a4d57b1e7e18" (UID: "7547f514-4e34-4e57-ac67-a4d57b1e7e18"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.942399 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7547f514-4e34-4e57-ac67-a4d57b1e7e18" (UID: "7547f514-4e34-4e57-ac67-a4d57b1e7e18"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:03 crc kubenswrapper[4717]: I1007 14:06:03.942623 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq" (OuterVolumeSpecName: "kube-api-access-ws7bq") pod "7547f514-4e34-4e57-ac67-a4d57b1e7e18" (UID: "7547f514-4e34-4e57-ac67-a4d57b1e7e18"). InnerVolumeSpecName "kube-api-access-ws7bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.038161 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7547f514-4e34-4e57-ac67-a4d57b1e7e18-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.038460 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws7bq\" (UniqueName: \"kubernetes.io/projected/7547f514-4e34-4e57-ac67-a4d57b1e7e18-kube-api-access-ws7bq\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.038474 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.038482 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7547f514-4e34-4e57-ac67-a4d57b1e7e18-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.139945 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.145514 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-6lfbq"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.176477 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.347314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config\") pod \"67e983f2-2f6b-40c2-9690-bf0199c02f04\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.347373 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca\") pod \"67e983f2-2f6b-40c2-9690-bf0199c02f04\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.347403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpzbs\" (UniqueName: \"kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs\") pod \"67e983f2-2f6b-40c2-9690-bf0199c02f04\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.347435 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles\") pod \"67e983f2-2f6b-40c2-9690-bf0199c02f04\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.347467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert\") pod \"67e983f2-2f6b-40c2-9690-bf0199c02f04\" (UID: \"67e983f2-2f6b-40c2-9690-bf0199c02f04\") " Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.348146 4717 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca" (OuterVolumeSpecName: "client-ca") pod "67e983f2-2f6b-40c2-9690-bf0199c02f04" (UID: "67e983f2-2f6b-40c2-9690-bf0199c02f04"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.348177 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "67e983f2-2f6b-40c2-9690-bf0199c02f04" (UID: "67e983f2-2f6b-40c2-9690-bf0199c02f04"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.348567 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config" (OuterVolumeSpecName: "config") pod "67e983f2-2f6b-40c2-9690-bf0199c02f04" (UID: "67e983f2-2f6b-40c2-9690-bf0199c02f04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.351973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs" (OuterVolumeSpecName: "kube-api-access-zpzbs") pod "67e983f2-2f6b-40c2-9690-bf0199c02f04" (UID: "67e983f2-2f6b-40c2-9690-bf0199c02f04"). InnerVolumeSpecName "kube-api-access-zpzbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.352775 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67e983f2-2f6b-40c2-9690-bf0199c02f04" (UID: "67e983f2-2f6b-40c2-9690-bf0199c02f04"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.365312 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:04 crc kubenswrapper[4717]: E1007 14:06:04.365524 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerName="route-controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.365581 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerName="route-controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: E1007 14:06:04.365598 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerName="controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.365605 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerName="controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.365709 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" containerName="route-controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.365720 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerName="controller-manager" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.366077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.408761 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.448901 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.448945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.448961 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.448992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " 
pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449100 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z47pz\" (UniqueName: \"kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449131 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449142 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e983f2-2f6b-40c2-9690-bf0199c02f04-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449151 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449159 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e983f2-2f6b-40c2-9690-bf0199c02f04-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.449167 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpzbs\" (UniqueName: \"kubernetes.io/projected/67e983f2-2f6b-40c2-9690-bf0199c02f04-kube-api-access-zpzbs\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.550295 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.550381 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z47pz\" (UniqueName: \"kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.550418 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.550435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc 
kubenswrapper[4717]: I1007 14:06:04.550453 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.551321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.552391 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.553229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.553310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.564691 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z47pz\" (UniqueName: \"kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz\") pod \"controller-manager-dd6845996-knmfk\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.680905 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.819655 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.852544 4717 generic.go:334] "Generic (PLEG): container finished" podID="67e983f2-2f6b-40c2-9690-bf0199c02f04" containerID="7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf" exitCode=0 Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.852631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" event={"ID":"67e983f2-2f6b-40c2-9690-bf0199c02f04","Type":"ContainerDied","Data":"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf"} Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.852670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" event={"ID":"67e983f2-2f6b-40c2-9690-bf0199c02f04","Type":"ContainerDied","Data":"3e664b772462a6f778f7f49fa8782fe4655479c23c5b9d2b26ee2f1ed6c5559f"} Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.852696 4717 scope.go:117] "RemoveContainer" containerID="7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.852794 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-9lpb4" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.916109 4717 scope.go:117] "RemoveContainer" containerID="7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf" Oct 07 14:06:04 crc kubenswrapper[4717]: E1007 14:06:04.920353 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf\": container with ID starting with 7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf not found: ID does not exist" containerID="7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.920410 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf"} err="failed to get container status \"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf\": rpc error: code = NotFound desc = could not find container \"7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf\": container with ID starting with 7d87cde8edb2a8253d404e389d4480882aa13bfd776bb2ad986cf039fde06caf not found: ID does not exist" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.921606 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7547f514-4e34-4e57-ac67-a4d57b1e7e18" path="/var/lib/kubelet/pods/7547f514-4e34-4e57-ac67-a4d57b1e7e18/volumes" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.922271 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.922991 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.923248 4717 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.923366 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.923386 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.923524 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.924182 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5hlst"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.924357 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.924629 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.926946 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.931353 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.937498 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.937527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.937681 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.937962 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.937996 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.938121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.938171 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.941664 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.946483 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-9lpb4"] Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.957706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-client-ca\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-config\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958356 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvgs5\" (UniqueName: \"kubernetes.io/projected/8083afc6-9711-4408-bd7c-d92138300930-kube-api-access-rvgs5\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlv29\" (UniqueName: \"kubernetes.io/projected/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-kube-api-access-wlv29\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958638 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958687 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6bp\" (UniqueName: \"kubernetes.io/projected/df498fdc-f1f6-4fe0-8362-bb061a651a0a-kube-api-access-td6bp\") pod \"nmstate-metrics-fdff9cb8d-rlbn4\" (UID: \"df498fdc-f1f6-4fe0-8362-bb061a651a0a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-dbus-socket\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4qpv\" (UniqueName: \"kubernetes.io/projected/f70faa25-3fd3-4e7d-bf27-03a163483cb3-kube-api-access-w4qpv\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958764 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-serving-cert\") pod 
\"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-nmstate-lock\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.958814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-ovs-socket\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:04 crc kubenswrapper[4717]: I1007 14:06:04.996705 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.033573 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.035725 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.039260 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.039326 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-sclwp" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.039496 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.046901 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.064876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/af496732-9d8f-4872-b431-87ae5dc74691-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.064929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/af496732-9d8f-4872-b431-87ae5dc74691-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.064953 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvgs5\" (UniqueName: \"kubernetes.io/projected/8083afc6-9711-4408-bd7c-d92138300930-kube-api-access-rvgs5\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: 
I1007 14:06:05.064982 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlv29\" (UniqueName: \"kubernetes.io/projected/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-kube-api-access-wlv29\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065023 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065046 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td6bp\" (UniqueName: \"kubernetes.io/projected/df498fdc-f1f6-4fe0-8362-bb061a651a0a-kube-api-access-td6bp\") pod \"nmstate-metrics-fdff9cb8d-rlbn4\" (UID: \"df498fdc-f1f6-4fe0-8362-bb061a651a0a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-dbus-socket\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-serving-cert\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4qpv\" (UniqueName: \"kubernetes.io/projected/f70faa25-3fd3-4e7d-bf27-03a163483cb3-kube-api-access-w4qpv\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-nmstate-lock\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nzz\" (UniqueName: \"kubernetes.io/projected/af496732-9d8f-4872-b431-87ae5dc74691-kube-api-access-69nzz\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065174 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-ovs-socket\") pod 
\"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065218 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-client-ca\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-config\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: E1007 14:06:05.065504 4717 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 07 14:06:05 crc kubenswrapper[4717]: E1007 14:06:05.065567 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair podName:f70faa25-3fd3-4e7d-bf27-03a163483cb3 nodeName:}" failed. No retries permitted until 2025-10-07 14:06:05.565546709 +0000 UTC m=+747.393472501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair") pod "nmstate-webhook-6cdbc54649-wlmfn" (UID: "f70faa25-3fd3-4e7d-bf27-03a163483cb3") : secret "openshift-nmstate-webhook" not found Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065769 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-nmstate-lock\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.065836 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-ovs-socket\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.066036 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8083afc6-9711-4408-bd7c-d92138300930-dbus-socket\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.066377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-config\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.066741 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-client-ca\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.070024 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-serving-cert\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.087046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvgs5\" (UniqueName: \"kubernetes.io/projected/8083afc6-9711-4408-bd7c-d92138300930-kube-api-access-rvgs5\") pod \"nmstate-handler-5hlst\" (UID: \"8083afc6-9711-4408-bd7c-d92138300930\") " pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.092303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4qpv\" (UniqueName: \"kubernetes.io/projected/f70faa25-3fd3-4e7d-bf27-03a163483cb3-kube-api-access-w4qpv\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.093113 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td6bp\" (UniqueName: \"kubernetes.io/projected/df498fdc-f1f6-4fe0-8362-bb061a651a0a-kube-api-access-td6bp\") pod \"nmstate-metrics-fdff9cb8d-rlbn4\" (UID: \"df498fdc-f1f6-4fe0-8362-bb061a651a0a\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.097683 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlv29\" (UniqueName: \"kubernetes.io/projected/82d4aef3-65eb-4d7f-911b-de1912a0c7dd-kube-api-access-wlv29\") pod \"route-controller-manager-6d86cdf98c-7d66f\" (UID: \"82d4aef3-65eb-4d7f-911b-de1912a0c7dd\") " pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.166462 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/af496732-9d8f-4872-b431-87ae5dc74691-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.166857 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nzz\" (UniqueName: \"kubernetes.io/projected/af496732-9d8f-4872-b431-87ae5dc74691-kube-api-access-69nzz\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.166906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/af496732-9d8f-4872-b431-87ae5dc74691-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.168158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/af496732-9d8f-4872-b431-87ae5dc74691-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.170808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/af496732-9d8f-4872-b431-87ae5dc74691-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.185112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nzz\" (UniqueName: \"kubernetes.io/projected/af496732-9d8f-4872-b431-87ae5dc74691-kube-api-access-69nzz\") pod \"nmstate-console-plugin-6b874cbd85-x6gc5\" (UID: \"af496732-9d8f-4872-b431-87ae5dc74691\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.242179 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.262132 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.275235 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:05 crc kubenswrapper[4717]: W1007 14:06:05.295749 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8083afc6_9711_4408_bd7c_d92138300930.slice/crio-ba29081114b16a9ca5f0bf8bdb96f81fe3ea1ace34046f85c6f237090d8f03a3 WatchSource:0}: Error finding container ba29081114b16a9ca5f0bf8bdb96f81fe3ea1ace34046f85c6f237090d8f03a3: Status 404 returned error can't find the container with id ba29081114b16a9ca5f0bf8bdb96f81fe3ea1ace34046f85c6f237090d8f03a3 Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.451838 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.453408 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.524935 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f"] Oct 07 14:06:05 crc kubenswrapper[4717]: W1007 14:06:05.562808 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d4aef3_65eb_4d7f_911b_de1912a0c7dd.slice/crio-05fafbd22a4dac3e59caf0912fa5be2661288dcb37b3ddbcc3d96c864a5e2c3b WatchSource:0}: Error finding container 05fafbd22a4dac3e59caf0912fa5be2661288dcb37b3ddbcc3d96c864a5e2c3b: Status 404 returned error can't find the container with id 05fafbd22a4dac3e59caf0912fa5be2661288dcb37b3ddbcc3d96c864a5e2c3b Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.572484 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.576373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f70faa25-3fd3-4e7d-bf27-03a163483cb3-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-wlmfn\" (UID: \"f70faa25-3fd3-4e7d-bf27-03a163483cb3\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.612666 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-55bdb4d455-9slbk"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.613454 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.635077 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bdb4d455-9slbk"] Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.674872 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmgrf\" (UniqueName: \"kubernetes.io/projected/71cfd559-e070-404c-8de4-8c1b2df3a845-kube-api-access-jmgrf\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.674946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-trusted-ca-bundle\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.674986 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-oauth-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.675032 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.675084 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-console-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.675105 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-oauth-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.675128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-service-ca\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.694044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5"] Oct 07 14:06:05 crc kubenswrapper[4717]: W1007 14:06:05.708068 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf496732_9d8f_4872_b431_87ae5dc74691.slice/crio-c38631e4f9e533dfd23c546514dc32af1a0dbcc992fefcb419d8691d5bb0d3bb WatchSource:0}: Error finding container c38631e4f9e533dfd23c546514dc32af1a0dbcc992fefcb419d8691d5bb0d3bb: Status 404 returned error can't find the container with id c38631e4f9e533dfd23c546514dc32af1a0dbcc992fefcb419d8691d5bb0d3bb Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776508 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-trusted-ca-bundle\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776565 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-oauth-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776597 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776637 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-console-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-oauth-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776674 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-service-ca\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.776695 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmgrf\" (UniqueName: \"kubernetes.io/projected/71cfd559-e070-404c-8de4-8c1b2df3a845-kube-api-access-jmgrf\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.778157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-trusted-ca-bundle\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " 
pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.779392 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-console-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.779410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-service-ca\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.780074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/71cfd559-e070-404c-8de4-8c1b2df3a845-oauth-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.783240 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-oauth-config\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.788225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/71cfd559-e070-404c-8de4-8c1b2df3a845-console-serving-cert\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.800384 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmgrf\" (UniqueName: \"kubernetes.io/projected/71cfd559-e070-404c-8de4-8c1b2df3a845-kube-api-access-jmgrf\") pod \"console-55bdb4d455-9slbk\" (UID: \"71cfd559-e070-404c-8de4-8c1b2df3a845\") " pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.867914 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.874528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" event={"ID":"df498fdc-f1f6-4fe0-8362-bb061a651a0a","Type":"ContainerStarted","Data":"a2836ea675720c49a82e90c5b95aeb8b100c202fcfc766ade56c6e240dbf3bfe"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.876434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" event={"ID":"82d4aef3-65eb-4d7f-911b-de1912a0c7dd","Type":"ContainerStarted","Data":"ec14212cc1c41f33c68b45ee400dd8c080646c404c8962bb0b5b134ca7da1f5c"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.876464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" event={"ID":"82d4aef3-65eb-4d7f-911b-de1912a0c7dd","Type":"ContainerStarted","Data":"05fafbd22a4dac3e59caf0912fa5be2661288dcb37b3ddbcc3d96c864a5e2c3b"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.876702 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.878303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5hlst" event={"ID":"8083afc6-9711-4408-bd7c-d92138300930","Type":"ContainerStarted","Data":"ba29081114b16a9ca5f0bf8bdb96f81fe3ea1ace34046f85c6f237090d8f03a3"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.879161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" event={"ID":"af496732-9d8f-4872-b431-87ae5dc74691","Type":"ContainerStarted","Data":"c38631e4f9e533dfd23c546514dc32af1a0dbcc992fefcb419d8691d5bb0d3bb"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.880772 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" event={"ID":"cf64ac76-5b32-4022-9651-a72eeadc3273","Type":"ContainerStarted","Data":"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.880796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" event={"ID":"cf64ac76-5b32-4022-9651-a72eeadc3273","Type":"ContainerStarted","Data":"8862c998a1344c4adf8960f742e8dec24e76c32ad57251d1d4fd420fc4b5dae2"} Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.880920 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" podUID="cf64ac76-5b32-4022-9651-a72eeadc3273" containerName="controller-manager" containerID="cri-o://e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163" gracePeriod=30 Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.881335 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.888781 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.911862 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" podStartSLOduration=1.911843321 podStartE2EDuration="1.911843321s" podCreationTimestamp="2025-10-07 14:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:06:05.89426653 +0000 UTC m=+747.722192322" watchObservedRunningTime="2025-10-07 14:06:05.911843321 +0000 UTC m=+747.739769113" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.913699 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" podStartSLOduration=2.913691232 podStartE2EDuration="2.913691232s" podCreationTimestamp="2025-10-07 14:06:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:06:05.911145571 +0000 UTC m=+747.739071383" watchObservedRunningTime="2025-10-07 14:06:05.913691232 +0000 UTC m=+747.741617024" Oct 07 14:06:05 crc kubenswrapper[4717]: I1007 14:06:05.941908 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.125670 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn"] Oct 07 14:06:06 crc kubenswrapper[4717]: W1007 14:06:06.129767 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf70faa25_3fd3_4e7d_bf27_03a163483cb3.slice/crio-e4cb1c93d7f61f67f3c06b7035837b867d029bc66d23acc32922ba9c81655fc6 WatchSource:0}: Error finding container e4cb1c93d7f61f67f3c06b7035837b867d029bc66d23acc32922ba9c81655fc6: Status 404 returned error can't find the container with id e4cb1c93d7f61f67f3c06b7035837b867d029bc66d23acc32922ba9c81655fc6 Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.223504 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55bdb4d455-9slbk"] Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.337676 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.366345 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bc977955c-vccgh"] Oct 07 14:06:06 crc kubenswrapper[4717]: E1007 14:06:06.366607 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf64ac76-5b32-4022-9651-a72eeadc3273" containerName="controller-manager" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.366631 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf64ac76-5b32-4022-9651-a72eeadc3273" containerName="controller-manager" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.366775 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf64ac76-5b32-4022-9651-a72eeadc3273" containerName="controller-manager" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.367253 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382235 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config\") pod \"cf64ac76-5b32-4022-9651-a72eeadc3273\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z47pz\" (UniqueName: \"kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz\") pod \"cf64ac76-5b32-4022-9651-a72eeadc3273\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382337 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles\") pod \"cf64ac76-5b32-4022-9651-a72eeadc3273\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382367 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca\") pod \"cf64ac76-5b32-4022-9651-a72eeadc3273\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert\") pod \"cf64ac76-5b32-4022-9651-a72eeadc3273\" (UID: \"cf64ac76-5b32-4022-9651-a72eeadc3273\") " Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382595 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-proxy-ca-bundles\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382657 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hswnk\" (UniqueName: \"kubernetes.io/projected/bef2a237-b10d-4424-8d7f-ca0238d6f57a-kube-api-access-hswnk\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382688 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-config\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef2a237-b10d-4424-8d7f-ca0238d6f57a-serving-cert\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " 
pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.382777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-client-ca\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.383204 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf64ac76-5b32-4022-9651-a72eeadc3273" (UID: "cf64ac76-5b32-4022-9651-a72eeadc3273"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.383230 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config" (OuterVolumeSpecName: "config") pod "cf64ac76-5b32-4022-9651-a72eeadc3273" (UID: "cf64ac76-5b32-4022-9651-a72eeadc3273"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.383466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cf64ac76-5b32-4022-9651-a72eeadc3273" (UID: "cf64ac76-5b32-4022-9651-a72eeadc3273"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.387822 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz" (OuterVolumeSpecName: "kube-api-access-z47pz") pod "cf64ac76-5b32-4022-9651-a72eeadc3273" (UID: "cf64ac76-5b32-4022-9651-a72eeadc3273"). InnerVolumeSpecName "kube-api-access-z47pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.398816 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf64ac76-5b32-4022-9651-a72eeadc3273" (UID: "cf64ac76-5b32-4022-9651-a72eeadc3273"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.419653 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc977955c-vccgh"] Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.482813 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d86cdf98c-7d66f" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483475 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-config\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483534 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef2a237-b10d-4424-8d7f-ca0238d6f57a-serving-cert\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-client-ca\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483596 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-proxy-ca-bundles\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483663 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hswnk\" (UniqueName: \"kubernetes.io/projected/bef2a237-b10d-4424-8d7f-ca0238d6f57a-kube-api-access-hswnk\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483723 4717 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf64ac76-5b32-4022-9651-a72eeadc3273-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483737 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483751 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z47pz\" (UniqueName: \"kubernetes.io/projected/cf64ac76-5b32-4022-9651-a72eeadc3273-kube-api-access-z47pz\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483763 4717 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.483775 4717 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf64ac76-5b32-4022-9651-a72eeadc3273-client-ca\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.484992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-client-ca\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.485141 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-config\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.485277 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bef2a237-b10d-4424-8d7f-ca0238d6f57a-proxy-ca-bundles\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.487851 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bef2a237-b10d-4424-8d7f-ca0238d6f57a-serving-cert\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.501179 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hswnk\" (UniqueName: \"kubernetes.io/projected/bef2a237-b10d-4424-8d7f-ca0238d6f57a-kube-api-access-hswnk\") pod \"controller-manager-bc977955c-vccgh\" (UID: \"bef2a237-b10d-4424-8d7f-ca0238d6f57a\") " pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.683752 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.874698 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e983f2-2f6b-40c2-9690-bf0199c02f04" path="/var/lib/kubelet/pods/67e983f2-2f6b-40c2-9690-bf0199c02f04/volumes" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.885447 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bdb4d455-9slbk" event={"ID":"71cfd559-e070-404c-8de4-8c1b2df3a845","Type":"ContainerStarted","Data":"5770729cde33dd6240fbaf901243606f60b193a94fd418e5a8b880647c390707"} Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.885488 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55bdb4d455-9slbk" event={"ID":"71cfd559-e070-404c-8de4-8c1b2df3a845","Type":"ContainerStarted","Data":"8803700c5a0e4b5564c7e7d6ecccbbf70075d67b27132379ed6d987967cff858"} Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.886078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" event={"ID":"f70faa25-3fd3-4e7d-bf27-03a163483cb3","Type":"ContainerStarted","Data":"e4cb1c93d7f61f67f3c06b7035837b867d029bc66d23acc32922ba9c81655fc6"} Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.887255 4717 generic.go:334] "Generic (PLEG): container finished" podID="cf64ac76-5b32-4022-9651-a72eeadc3273" containerID="e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163" exitCode=0 Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.887432 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.887596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" event={"ID":"cf64ac76-5b32-4022-9651-a72eeadc3273","Type":"ContainerDied","Data":"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163"} Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.887619 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-dd6845996-knmfk" event={"ID":"cf64ac76-5b32-4022-9651-a72eeadc3273","Type":"ContainerDied","Data":"8862c998a1344c4adf8960f742e8dec24e76c32ad57251d1d4fd420fc4b5dae2"} Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.887634 4717 scope.go:117] "RemoveContainer" containerID="e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.916203 4717 scope.go:117] "RemoveContainer" containerID="e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.917706 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55bdb4d455-9slbk" podStartSLOduration=1.917683853 podStartE2EDuration="1.917683853s" podCreationTimestamp="2025-10-07 14:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:06:06.903696494 +0000 UTC m=+748.731622306" watchObservedRunningTime="2025-10-07 14:06:06.917683853 +0000 UTC m=+748.745609645" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.918894 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:06 crc kubenswrapper[4717]: E1007 14:06:06.919275 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163\": container with ID starting with e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163 not found: ID does not exist" containerID="e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.919313 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163"} err="failed to get container status \"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163\": rpc error: code = NotFound desc = could not find container \"e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163\": container with ID starting with e32bcafce95ba1f19ffd4c669ee9e3c72538c3baa12384bf64e95700ec831163 not found: ID does not exist" Oct 07 14:06:06 crc kubenswrapper[4717]: I1007 14:06:06.921779 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-dd6845996-knmfk"] Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.077188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bc977955c-vccgh"] Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.897797 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" event={"ID":"bef2a237-b10d-4424-8d7f-ca0238d6f57a","Type":"ContainerStarted","Data":"02f2bb111d45cb763b28253ab6968c1682e140a4eff8fb293c20f3ce7e216d87"} Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.898152 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.898164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" event={"ID":"bef2a237-b10d-4424-8d7f-ca0238d6f57a","Type":"ContainerStarted","Data":"f2d70a6e00b39bcc5c42ccc50174c3984d17911be1094e4e16401d90cfc32657"} Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.903443 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" Oct 07 14:06:07 crc kubenswrapper[4717]: I1007 14:06:07.922561 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bc977955c-vccgh" podStartSLOduration=3.922540899 podStartE2EDuration="3.922540899s" podCreationTimestamp="2025-10-07 14:06:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:06:07.915366879 +0000 UTC m=+749.743292691" watchObservedRunningTime="2025-10-07 14:06:07.922540899 +0000 UTC m=+749.750466691" Oct 07 14:06:08 crc kubenswrapper[4717]: I1007 14:06:08.878257 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf64ac76-5b32-4022-9651-a72eeadc3273" path="/var/lib/kubelet/pods/cf64ac76-5b32-4022-9651-a72eeadc3273/volumes" Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.913192 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" event={"ID":"f70faa25-3fd3-4e7d-bf27-03a163483cb3","Type":"ContainerStarted","Data":"a37f2b648cacefcd29b31c6db32eae67fe552c83ace8c6636fc24c07e92fd358"} Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.913554 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.915139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" event={"ID":"af496732-9d8f-4872-b431-87ae5dc74691","Type":"ContainerStarted","Data":"ae08390e9e6508da5f16a4e40195b8ddf7dc022e64ec2618b4be830eee43d77c"} Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.916377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5hlst" event={"ID":"8083afc6-9711-4408-bd7c-d92138300930","Type":"ContainerStarted","Data":"bdf2602b3b43e8a5ae5e5422dfaf54be79dc2eba5dd1d3c115e055432d6c06a6"} Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.916712 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.918854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" event={"ID":"df498fdc-f1f6-4fe0-8362-bb061a651a0a","Type":"ContainerStarted","Data":"6d3fb76f9997a6597971d2f7d5515421b14e53490875755d9392dc863fe2eae2"} Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.933436 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" podStartSLOduration=2.535720468 podStartE2EDuration="5.933413903s" podCreationTimestamp="2025-10-07 14:06:04 +0000 UTC" firstStartedPulling="2025-10-07 14:06:06.131835369 +0000 UTC m=+747.959761161" lastFinishedPulling="2025-10-07 14:06:09.529528804 +0000 UTC m=+751.357454596" observedRunningTime="2025-10-07 14:06:09.927366124 +0000 UTC m=+751.755291926" watchObservedRunningTime="2025-10-07 14:06:09.933413903 +0000 UTC m=+751.761339705" Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.944840 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5hlst" podStartSLOduration=1.7732106829999998 podStartE2EDuration="5.944817691s" podCreationTimestamp="2025-10-07 14:06:04 +0000 UTC" firstStartedPulling="2025-10-07 14:06:05.297390797 +0000 UTC m=+747.125316589" lastFinishedPulling="2025-10-07 14:06:09.468997805 +0000 UTC m=+751.296923597" observedRunningTime="2025-10-07 14:06:09.941276702 +0000 UTC m=+751.769202504" watchObservedRunningTime="2025-10-07 14:06:09.944817691 +0000 UTC m=+751.772743483" Oct 07 14:06:09 crc kubenswrapper[4717]: I1007 14:06:09.954622 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-x6gc5" podStartSLOduration=1.203704254 podStartE2EDuration="4.954602684s" podCreationTimestamp="2025-10-07 14:06:05 +0000 UTC" firstStartedPulling="2025-10-07 14:06:05.709750602 +0000 UTC m=+747.537676394" lastFinishedPulling="2025-10-07 14:06:09.460649032 +0000 UTC m=+751.288574824" observedRunningTime="2025-10-07 14:06:09.953493713 +0000 UTC m=+751.781419505" watchObservedRunningTime="2025-10-07 14:06:09.954602684 +0000 UTC m=+751.782528476" Oct 07 14:06:12 crc kubenswrapper[4717]: I1007 14:06:12.786194 4717 
dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 07 14:06:12 crc kubenswrapper[4717]: I1007 14:06:12.938567 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" event={"ID":"df498fdc-f1f6-4fe0-8362-bb061a651a0a","Type":"ContainerStarted","Data":"f863c69fa9620c55a882d1443b465ab91fb38ce4586c6cc5bdea82344b4d3116"} Oct 07 14:06:12 crc kubenswrapper[4717]: I1007 14:06:12.955959 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-rlbn4" podStartSLOduration=2.356443465 podStartE2EDuration="8.955936911s" podCreationTimestamp="2025-10-07 14:06:04 +0000 UTC" firstStartedPulling="2025-10-07 14:06:05.472251546 +0000 UTC m=+747.300177338" lastFinishedPulling="2025-10-07 14:06:12.071744952 +0000 UTC m=+753.899670784" observedRunningTime="2025-10-07 14:06:12.952092574 +0000 UTC m=+754.780018386" watchObservedRunningTime="2025-10-07 14:06:12.955936911 +0000 UTC m=+754.783862733" Oct 07 14:06:15 crc kubenswrapper[4717]: I1007 14:06:15.301492 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5hlst" Oct 07 14:06:15 crc kubenswrapper[4717]: I1007 14:06:15.942379 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:15 crc kubenswrapper[4717]: I1007 14:06:15.942437 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:15 crc kubenswrapper[4717]: I1007 14:06:15.948951 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:15 crc kubenswrapper[4717]: I1007 14:06:15.959914 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55bdb4d455-9slbk" Oct 07 14:06:16 crc kubenswrapper[4717]: I1007 14:06:16.019043 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 14:06:25 crc kubenswrapper[4717]: I1007 14:06:25.872700 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-wlmfn" Oct 07 14:06:31 crc kubenswrapper[4717]: I1007 14:06:31.609452 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:06:31 crc kubenswrapper[4717]: I1007 14:06:31.610249 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.560304 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh"] Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.562069 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.564066 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.570971 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh"] Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.650924 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.651110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7cdl\" (UniqueName: \"kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.651162 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.751874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7cdl\" (UniqueName: \"kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.751938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.751973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.752526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.752655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.770965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7cdl\" (UniqueName: \"kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:37 crc kubenswrapper[4717]: I1007 14:06:37.879798 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:38 crc kubenswrapper[4717]: I1007 14:06:38.294187 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh"] Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.102637 4717 generic.go:334] "Generic (PLEG): container finished" podID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerID="4a1c520abca95515a7b4f912a1370103dac4f2ceddc909129ce6b24059a63b0d" exitCode=0 Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.102686 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" event={"ID":"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0","Type":"ContainerDied","Data":"4a1c520abca95515a7b4f912a1370103dac4f2ceddc909129ce6b24059a63b0d"} Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.102712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" event={"ID":"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0","Type":"ContainerStarted","Data":"e85fd997ab453f63f4567d5816503d7452a776d90e4b9eaab745a9a4f90e6d1d"} Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.480063 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.481856 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.494339 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.576796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.576901 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lw2d\" (UniqueName: \"kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.576922 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.678667 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lw2d\" (UniqueName: \"kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.678738 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.678862 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.679555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.679670 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.701118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8lw2d\" (UniqueName: \"kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d\") pod \"redhat-operators-7pncz\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:39 crc kubenswrapper[4717]: I1007 14:06:39.812992 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:40 crc kubenswrapper[4717]: I1007 14:06:40.225653 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:40 crc kubenswrapper[4717]: W1007 14:06:40.235079 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc886c138_73e5_47f0_9656_bedd84b6257e.slice/crio-525934f3134167713d43ac660802b4166bb7b36655d55701c9b73372d7a2bef0 WatchSource:0}: Error finding container 525934f3134167713d43ac660802b4166bb7b36655d55701c9b73372d7a2bef0: Status 404 returned error can't find the container with id 525934f3134167713d43ac660802b4166bb7b36655d55701c9b73372d7a2bef0 Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.063721 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-k6dvg" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" containerID="cri-o://2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c" gracePeriod=15 Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.114423 4717 generic.go:334] "Generic (PLEG): container finished" podID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerID="d9b80980ee6bdcee69b9b91a95de34f50b2791b2f62824518fea93651ab3d299" exitCode=0 Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.114538 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" event={"ID":"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0","Type":"ContainerDied","Data":"d9b80980ee6bdcee69b9b91a95de34f50b2791b2f62824518fea93651ab3d299"} Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.116019 4717 generic.go:334] "Generic (PLEG): container finished" podID="c886c138-73e5-47f0-9656-bedd84b6257e" containerID="5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292" exitCode=0 Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.116047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerDied","Data":"5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292"} Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.116061 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerStarted","Data":"525934f3134167713d43ac660802b4166bb7b36655d55701c9b73372d7a2bef0"} Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.542609 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k6dvg_cd44616b-54d3-418a-ac99-9b7ff3c4d2d9/console/0.log" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.542959 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701873 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6m4t\" (UniqueName: \"kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.702790 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.701994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.702999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config\") pod \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\" (UID: \"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9\") " Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.702126 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config" (OuterVolumeSpecName: "console-config") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.702486 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.702524 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.703346 4717 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-service-ca\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.703362 4717 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.703372 4717 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.703392 4717 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.707531 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t" (OuterVolumeSpecName: "kube-api-access-b6m4t") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "kube-api-access-b6m4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.707949 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.708253 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" (UID: "cd44616b-54d3-418a-ac99-9b7ff3c4d2d9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.804299 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6m4t\" (UniqueName: \"kubernetes.io/projected/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-kube-api-access-b6m4t\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.804348 4717 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:41 crc kubenswrapper[4717]: I1007 14:06:41.804367 4717 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.129195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerStarted","Data":"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8"} Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.133440 4717 generic.go:334] "Generic (PLEG): container finished" podID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerID="c1e6b913306310811819c3cf83ebb91126661cd54c5c40edc8266016b8d4a502" exitCode=0 Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.133561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" event={"ID":"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0","Type":"ContainerDied","Data":"c1e6b913306310811819c3cf83ebb91126661cd54c5c40edc8266016b8d4a502"} Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137334 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k6dvg_cd44616b-54d3-418a-ac99-9b7ff3c4d2d9/console/0.log" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137390 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerID="2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c" exitCode=2 Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k6dvg" event={"ID":"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9","Type":"ContainerDied","Data":"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c"} Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137462 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k6dvg" event={"ID":"cd44616b-54d3-418a-ac99-9b7ff3c4d2d9","Type":"ContainerDied","Data":"0e954b8d57150b807c4c44d7a6baf1dc613cae375c2afb07f720f4c04b5eff8d"} Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137481 4717 scope.go:117] "RemoveContainer" containerID="2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.137597 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k6dvg" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.154752 4717 scope.go:117] "RemoveContainer" containerID="2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c" Oct 07 14:06:42 crc kubenswrapper[4717]: E1007 14:06:42.155302 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c\": container with ID starting with 2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c not found: ID does not exist" containerID="2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.155337 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c"} err="failed to get container status \"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c\": rpc error: code = NotFound desc = could not find container \"2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c\": container with ID starting with 2fdc3b6b103e0428bb881f2443fda80d093e7686ebb9251de58f52f435a40f0c not found: ID does not exist" Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.191747 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.197025 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-k6dvg"] Oct 07 14:06:42 crc kubenswrapper[4717]: I1007 14:06:42.878038 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" path="/var/lib/kubelet/pods/cd44616b-54d3-418a-ac99-9b7ff3c4d2d9/volumes" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.143462 4717 generic.go:334] "Generic (PLEG): container finished" podID="c886c138-73e5-47f0-9656-bedd84b6257e" containerID="fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8" exitCode=0 Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.143562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerDied","Data":"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8"} Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.474084 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.538216 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle\") pod \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.538366 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7cdl\" (UniqueName: \"kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl\") pod \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.538448 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util\") pod \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\" (UID: \"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0\") " Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.539347 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle" (OuterVolumeSpecName: "bundle") pod "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" (UID: "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.545928 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl" (OuterVolumeSpecName: "kube-api-access-b7cdl") pod "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" (UID: "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0"). InnerVolumeSpecName "kube-api-access-b7cdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.557380 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util" (OuterVolumeSpecName: "util") pod "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" (UID: "0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.639761 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-util\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.639806 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:43 crc kubenswrapper[4717]: I1007 14:06:43.639819 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7cdl\" (UniqueName: \"kubernetes.io/projected/0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0-kube-api-access-b7cdl\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:44 crc kubenswrapper[4717]: I1007 14:06:44.156309 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerStarted","Data":"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a"} Oct 07 14:06:44 crc kubenswrapper[4717]: I1007 14:06:44.159349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" event={"ID":"0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0","Type":"ContainerDied","Data":"e85fd997ab453f63f4567d5816503d7452a776d90e4b9eaab745a9a4f90e6d1d"} Oct 07 14:06:44 crc kubenswrapper[4717]: I1007 14:06:44.159396 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e85fd997ab453f63f4567d5816503d7452a776d90e4b9eaab745a9a4f90e6d1d" Oct 07 14:06:44 crc kubenswrapper[4717]: I1007 14:06:44.159464 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh" Oct 07 14:06:44 crc kubenswrapper[4717]: I1007 14:06:44.514539 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pncz" podStartSLOduration=2.943856261 podStartE2EDuration="5.514521596s" podCreationTimestamp="2025-10-07 14:06:39 +0000 UTC" firstStartedPulling="2025-10-07 14:06:41.117439735 +0000 UTC m=+782.945365527" lastFinishedPulling="2025-10-07 14:06:43.68810507 +0000 UTC m=+785.516030862" observedRunningTime="2025-10-07 14:06:44.185645335 +0000 UTC m=+786.013571127" watchObservedRunningTime="2025-10-07 14:06:44.514521596 +0000 UTC m=+786.342447388" Oct 07 14:06:49 crc kubenswrapper[4717]: I1007 14:06:49.813583 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:49 crc kubenswrapper[4717]: I1007 14:06:49.813984 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:49 crc kubenswrapper[4717]: I1007 14:06:49.854254 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:50 crc kubenswrapper[4717]: I1007 14:06:50.243514 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:51 crc kubenswrapper[4717]: I1007 14:06:51.662521 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.201206 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pncz" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="registry-server" containerID="cri-o://c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a" gracePeriod=2 Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.602396 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.707879 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lw2d\" (UniqueName: \"kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d\") pod \"c886c138-73e5-47f0-9656-bedd84b6257e\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.707987 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities\") pod \"c886c138-73e5-47f0-9656-bedd84b6257e\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.708036 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content\") pod \"c886c138-73e5-47f0-9656-bedd84b6257e\" (UID: \"c886c138-73e5-47f0-9656-bedd84b6257e\") " Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.709440 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities" (OuterVolumeSpecName: "utilities") pod "c886c138-73e5-47f0-9656-bedd84b6257e" (UID: "c886c138-73e5-47f0-9656-bedd84b6257e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.713155 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d" (OuterVolumeSpecName: "kube-api-access-8lw2d") pod "c886c138-73e5-47f0-9656-bedd84b6257e" (UID: "c886c138-73e5-47f0-9656-bedd84b6257e"). InnerVolumeSpecName "kube-api-access-8lw2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.809803 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lw2d\" (UniqueName: \"kubernetes.io/projected/c886c138-73e5-47f0-9656-bedd84b6257e-kube-api-access-8lw2d\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.809855 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.911758 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c886c138-73e5-47f0-9656-bedd84b6257e" (UID: "c886c138-73e5-47f0-9656-bedd84b6257e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:06:52 crc kubenswrapper[4717]: I1007 14:06:52.914077 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c886c138-73e5-47f0-9656-bedd84b6257e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.208525 4717 generic.go:334] "Generic (PLEG): container finished" podID="c886c138-73e5-47f0-9656-bedd84b6257e" containerID="c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a" exitCode=0 Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.208565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerDied","Data":"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a"} Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.208596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pncz" event={"ID":"c886c138-73e5-47f0-9656-bedd84b6257e","Type":"ContainerDied","Data":"525934f3134167713d43ac660802b4166bb7b36655d55701c9b73372d7a2bef0"} Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.208616 4717 scope.go:117] "RemoveContainer" containerID="c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.208611 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pncz" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.233630 4717 scope.go:117] "RemoveContainer" containerID="fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.243684 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.255440 4717 scope.go:117] "RemoveContainer" containerID="5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.258795 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pncz"] Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.277697 4717 scope.go:117] "RemoveContainer" containerID="c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a" Oct 07 14:06:53 crc kubenswrapper[4717]: E1007 14:06:53.278163 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a\": container with ID starting with c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a not found: ID does not exist" containerID="c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.278202 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a"} err="failed to get container status \"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a\": rpc error: code = NotFound desc = could not find container \"c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a\": container with ID starting with c3c41679b61b822f4fd49ca0a1d7e43b01a3a7d24cfef5d27c69f965956c841a not found: ID does not exist" Oct 07 14:06:53 crc 
kubenswrapper[4717]: I1007 14:06:53.278225 4717 scope.go:117] "RemoveContainer" containerID="fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8" Oct 07 14:06:53 crc kubenswrapper[4717]: E1007 14:06:53.278514 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8\": container with ID starting with fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8 not found: ID does not exist" containerID="fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.278541 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8"} err="failed to get container status \"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8\": rpc error: code = NotFound desc = could not find container \"fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8\": container with ID starting with fce4fca73e016ed9251d704eccb048863254682d33fa73d80de411b3cd2fd1d8 not found: ID does not exist" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.278557 4717 scope.go:117] "RemoveContainer" containerID="5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292" Oct 07 14:06:53 crc kubenswrapper[4717]: E1007 14:06:53.278785 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292\": container with ID starting with 5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292 not found: ID does not exist" containerID="5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292" Oct 07 14:06:53 crc kubenswrapper[4717]: I1007 14:06:53.278810 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292"} err="failed to get container status \"5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292\": rpc error: code = NotFound desc = could not find container \"5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292\": container with ID starting with 5f5e38e9cf0ccb8c017094791b5eac594264d6592f5f7b9e2e5b3945ac320292 not found: ID does not exist" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.012901 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44"] Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013435 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="util" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013447 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="util" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013455 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013460 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013469 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="extract-content" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013477 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="extract-content" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013489 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="pull" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013494 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="pull" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013506 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="extract" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013513 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="extract" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013521 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="registry-server" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013526 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="registry-server" Oct 07 14:06:54 crc kubenswrapper[4717]: E1007 14:06:54.013537 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="extract-utilities" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013543 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="extract-utilities" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013655 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd44616b-54d3-418a-ac99-9b7ff3c4d2d9" containerName="console" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013672 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0" containerName="extract" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.013684 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" containerName="registry-server" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.014063 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.015880 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.016170 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.016367 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-dz58d" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.017983 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.019212 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.035710 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44"] Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.160606 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-webhook-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.160661 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hn7\" (UniqueName: \"kubernetes.io/projected/2b39e023-1e34-4d26-8001-ce161b5c0dbd-kube-api-access-n6hn7\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.160863 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-apiservice-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.262375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-apiservice-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.262478 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-webhook-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.262503 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hn7\" (UniqueName: \"kubernetes.io/projected/2b39e023-1e34-4d26-8001-ce161b5c0dbd-kube-api-access-n6hn7\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.266535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-apiservice-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.270610 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2b39e023-1e34-4d26-8001-ce161b5c0dbd-webhook-cert\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.276746 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hn7\" (UniqueName: \"kubernetes.io/projected/2b39e023-1e34-4d26-8001-ce161b5c0dbd-kube-api-access-n6hn7\") pod \"metallb-operator-controller-manager-7b67b56d8d-7vj44\" (UID: \"2b39e023-1e34-4d26-8001-ce161b5c0dbd\") " pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.329936 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.357126 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz"] Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.357994 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.364226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.364639 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.367577 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-pgscx" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.373530 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz"] Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.473841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-apiservice-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.475541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk54m\" (UniqueName: \"kubernetes.io/projected/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-kube-api-access-gk54m\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.475630 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-webhook-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.579582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-webhook-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.579653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-apiservice-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.579676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk54m\" (UniqueName: \"kubernetes.io/projected/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-kube-api-access-gk54m\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 
14:06:54.590157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-apiservice-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.590597 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-webhook-cert\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.610842 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk54m\" (UniqueName: \"kubernetes.io/projected/0d8c8d26-60f0-4cf2-b139-fc06825e1ed4-kube-api-access-gk54m\") pod \"metallb-operator-webhook-server-749b845cbd-g6bgz\" (UID: \"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4\") " pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.715828 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.874663 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c886c138-73e5-47f0-9656-bedd84b6257e" path="/var/lib/kubelet/pods/c886c138-73e5-47f0-9656-bedd84b6257e/volumes" Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.875201 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44"] Oct 07 14:06:54 crc kubenswrapper[4717]: I1007 14:06:54.957241 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz"] Oct 07 14:06:54 crc kubenswrapper[4717]: W1007 14:06:54.966187 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d8c8d26_60f0_4cf2_b139_fc06825e1ed4.slice/crio-cd368321464d08a7fde2d14bcd6c8fea343b4cf8b8b1a81d01af9cb19f9a01ce WatchSource:0}: Error finding container cd368321464d08a7fde2d14bcd6c8fea343b4cf8b8b1a81d01af9cb19f9a01ce: Status 404 returned error can't find the container with id cd368321464d08a7fde2d14bcd6c8fea343b4cf8b8b1a81d01af9cb19f9a01ce Oct 07 14:06:55 crc kubenswrapper[4717]: I1007 14:06:55.221314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" event={"ID":"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4","Type":"ContainerStarted","Data":"cd368321464d08a7fde2d14bcd6c8fea343b4cf8b8b1a81d01af9cb19f9a01ce"} Oct 07 14:06:55 crc kubenswrapper[4717]: I1007 14:06:55.222366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" event={"ID":"2b39e023-1e34-4d26-8001-ce161b5c0dbd","Type":"ContainerStarted","Data":"8e7b6b76e25de352d322d6813ab7a53071c10905e58a6e95903b4131390ba42d"} Oct 07 14:06:55 crc kubenswrapper[4717]: I1007 14:06:55.873644 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:06:55 crc kubenswrapper[4717]: I1007 14:06:55.874959 4717 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:55 crc kubenswrapper[4717]: I1007 14:06:55.885200 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.000192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjxfz\" (UniqueName: \"kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.000242 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.000274 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.101535 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjxfz\" (UniqueName: \"kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.101590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.101630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.102122 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.102164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.127804 
4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjxfz\" (UniqueName: \"kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz\") pod \"community-operators-qdg5f\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.212173 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:06:56 crc kubenswrapper[4717]: I1007 14:06:56.717887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:06:56 crc kubenswrapper[4717]: W1007 14:06:56.727660 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703c4d91_5ec3_4126_8c70_440f48368d69.slice/crio-9b4b12d4c171e45ce45c4a96f644ca20ddafca18d10abed2cb1cc9b287cd0537 WatchSource:0}: Error finding container 9b4b12d4c171e45ce45c4a96f644ca20ddafca18d10abed2cb1cc9b287cd0537: Status 404 returned error can't find the container with id 9b4b12d4c171e45ce45c4a96f644ca20ddafca18d10abed2cb1cc9b287cd0537 Oct 07 14:06:57 crc kubenswrapper[4717]: I1007 14:06:57.233953 4717 generic.go:334] "Generic (PLEG): container finished" podID="703c4d91-5ec3-4126-8c70-440f48368d69" containerID="a7d595a8d059412ae33f193734e46c4a7325fa43bada1a7a7ede50d35acccd23" exitCode=0 Oct 07 14:06:57 crc kubenswrapper[4717]: I1007 14:06:57.233994 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerDied","Data":"a7d595a8d059412ae33f193734e46c4a7325fa43bada1a7a7ede50d35acccd23"} Oct 07 14:06:57 crc kubenswrapper[4717]: I1007 14:06:57.234087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerStarted","Data":"9b4b12d4c171e45ce45c4a96f644ca20ddafca18d10abed2cb1cc9b287cd0537"} Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.252968 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerStarted","Data":"c2f07dc62c25ac847b3e5d34c70785f2d34003290b241695c8eaea6b3245ea9e"} Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.254929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" event={"ID":"2b39e023-1e34-4d26-8001-ce161b5c0dbd","Type":"ContainerStarted","Data":"3c42412a8769d171132f34e7aae14f9bf1b7d18f775c8129f2ac0281c8690785"} Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.255065 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.258304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" event={"ID":"0d8c8d26-60f0-4cf2-b139-fc06825e1ed4","Type":"ContainerStarted","Data":"9b5be0d38a5532f5c380b1f3710cf2bd58950052d5f6647289fb20bd9e844ab4"} Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.258432 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.290704 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" podStartSLOduration=2.37744119 podStartE2EDuration="7.290685232s" podCreationTimestamp="2025-10-07 14:06:53 +0000 UTC" firstStartedPulling="2025-10-07 14:06:54.878817895 +0000 UTC m=+796.706743687" lastFinishedPulling="2025-10-07 14:06:59.792061937 +0000 UTC m=+801.619987729" observedRunningTime="2025-10-07 14:07:00.289577371 +0000 UTC m=+802.117503163" watchObservedRunningTime="2025-10-07 14:07:00.290685232 +0000 UTC m=+802.118611024" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.310159 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" podStartSLOduration=1.491399068 podStartE2EDuration="6.310140134s" podCreationTimestamp="2025-10-07 14:06:54 +0000 UTC" firstStartedPulling="2025-10-07 14:06:54.970467021 +0000 UTC m=+796.798392803" lastFinishedPulling="2025-10-07 14:06:59.789208077 +0000 UTC m=+801.617133869" observedRunningTime="2025-10-07 14:07:00.309790184 +0000 UTC m=+802.137715976" watchObservedRunningTime="2025-10-07 14:07:00.310140134 +0000 UTC m=+802.138065926" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.669035 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.670395 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.679196 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.730919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.731237 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.731365 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwhxl\" (UniqueName: \"kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.831952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 
14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.832022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwhxl\" (UniqueName: \"kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.832071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.832532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.832582 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.861095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwhxl\" (UniqueName: \"kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl\") pod \"certified-operators-qb2xx\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:00 crc kubenswrapper[4717]: I1007 14:07:00.984160 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.265706 4717 generic.go:334] "Generic (PLEG): container finished" podID="703c4d91-5ec3-4126-8c70-440f48368d69" containerID="c2f07dc62c25ac847b3e5d34c70785f2d34003290b241695c8eaea6b3245ea9e" exitCode=0 Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.266885 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerDied","Data":"c2f07dc62c25ac847b3e5d34c70785f2d34003290b241695c8eaea6b3245ea9e"} Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.506647 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.609729 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.609792 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.609835 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.610442 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:07:01 crc kubenswrapper[4717]: I1007 14:07:01.610752 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3" gracePeriod=600 Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.273547 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3" exitCode=0 Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.273618 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3"} Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.273966 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc"} Oct 07 14:07:02 
crc kubenswrapper[4717]: I1007 14:07:02.273985 4717 scope.go:117] "RemoveContainer" containerID="91bfd71de97d3141ca294a4e0d8773e9ff769b86c3768b31cab03dbe97d28d8a" Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.275742 4717 generic.go:334] "Generic (PLEG): container finished" podID="8bdf59de-b9e5-43db-be81-364da542c44e" containerID="7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335" exitCode=0 Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.275803 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerDied","Data":"7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335"} Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.275835 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerStarted","Data":"459e65b21cebbd19601bddecd31c18a08f724d553ba27344b130e41e660fc083"} Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.277661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerStarted","Data":"48872aa5a5a25f0ba602370adefb4658010b827d1347cfd2d5da443e58c9fd84"} Oct 07 14:07:02 crc kubenswrapper[4717]: I1007 14:07:02.313563 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qdg5f" podStartSLOduration=2.904094599 podStartE2EDuration="7.313546901s" podCreationTimestamp="2025-10-07 14:06:55 +0000 UTC" firstStartedPulling="2025-10-07 14:06:57.236689687 +0000 UTC m=+799.064615479" lastFinishedPulling="2025-10-07 14:07:01.646141989 +0000 UTC m=+803.474067781" observedRunningTime="2025-10-07 14:07:02.313373366 +0000 UTC m=+804.141299158" watchObservedRunningTime="2025-10-07 14:07:02.313546901 +0000 UTC m=+804.141472693" Oct 07 14:07:03 crc kubenswrapper[4717]: I1007 14:07:03.286374 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerStarted","Data":"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094"} Oct 07 14:07:04 crc kubenswrapper[4717]: I1007 14:07:04.292723 4717 generic.go:334] "Generic (PLEG): container finished" podID="8bdf59de-b9e5-43db-be81-364da542c44e" containerID="8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094" exitCode=0 Oct 07 14:07:04 crc kubenswrapper[4717]: I1007 14:07:04.292771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerDied","Data":"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094"} Oct 07 14:07:06 crc kubenswrapper[4717]: I1007 14:07:06.212345 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:06 crc kubenswrapper[4717]: I1007 14:07:06.212645 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:06 crc kubenswrapper[4717]: I1007 14:07:06.346313 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:06 crc kubenswrapper[4717]: I1007 14:07:06.396659 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:07 crc kubenswrapper[4717]: I1007 14:07:07.335346 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerStarted","Data":"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8"} Oct 07 14:07:07 crc kubenswrapper[4717]: I1007 14:07:07.355525 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qb2xx" podStartSLOduration=3.458243273 podStartE2EDuration="7.355510402s" podCreationTimestamp="2025-10-07 14:07:00 +0000 UTC" firstStartedPulling="2025-10-07 14:07:02.278105843 +0000 UTC m=+804.106031635" lastFinishedPulling="2025-10-07 14:07:06.175372972 +0000 UTC m=+808.003298764" observedRunningTime="2025-10-07 14:07:07.352748345 +0000 UTC m=+809.180674137" watchObservedRunningTime="2025-10-07 14:07:07.355510402 +0000 UTC m=+809.183436194" Oct 07 14:07:09 crc kubenswrapper[4717]: I1007 14:07:09.466628 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:07:09 crc kubenswrapper[4717]: I1007 14:07:09.467158 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qdg5f" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="registry-server" containerID="cri-o://48872aa5a5a25f0ba602370adefb4658010b827d1347cfd2d5da443e58c9fd84" gracePeriod=2 Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.356231 4717 generic.go:334] "Generic (PLEG): container finished" podID="703c4d91-5ec3-4126-8c70-440f48368d69" containerID="48872aa5a5a25f0ba602370adefb4658010b827d1347cfd2d5da443e58c9fd84" exitCode=0 Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.356315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerDied","Data":"48872aa5a5a25f0ba602370adefb4658010b827d1347cfd2d5da443e58c9fd84"} Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.523812 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.667859 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities\") pod \"703c4d91-5ec3-4126-8c70-440f48368d69\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.668251 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjxfz\" (UniqueName: \"kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz\") pod \"703c4d91-5ec3-4126-8c70-440f48368d69\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.668371 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content\") pod \"703c4d91-5ec3-4126-8c70-440f48368d69\" (UID: \"703c4d91-5ec3-4126-8c70-440f48368d69\") " Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.669016 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities" (OuterVolumeSpecName: "utilities") pod "703c4d91-5ec3-4126-8c70-440f48368d69" (UID: "703c4d91-5ec3-4126-8c70-440f48368d69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.673750 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz" (OuterVolumeSpecName: "kube-api-access-hjxfz") pod "703c4d91-5ec3-4126-8c70-440f48368d69" (UID: "703c4d91-5ec3-4126-8c70-440f48368d69"). InnerVolumeSpecName "kube-api-access-hjxfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.724687 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "703c4d91-5ec3-4126-8c70-440f48368d69" (UID: "703c4d91-5ec3-4126-8c70-440f48368d69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.770068 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.770125 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703c4d91-5ec3-4126-8c70-440f48368d69-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.770143 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjxfz\" (UniqueName: \"kubernetes.io/projected/703c4d91-5ec3-4126-8c70-440f48368d69-kube-api-access-hjxfz\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.984515 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:10 crc kubenswrapper[4717]: I1007 14:07:10.985091 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.066889 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.363388 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qdg5f" event={"ID":"703c4d91-5ec3-4126-8c70-440f48368d69","Type":"ContainerDied","Data":"9b4b12d4c171e45ce45c4a96f644ca20ddafca18d10abed2cb1cc9b287cd0537"} Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.363824 4717 scope.go:117] "RemoveContainer" containerID="48872aa5a5a25f0ba602370adefb4658010b827d1347cfd2d5da443e58c9fd84" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.363960 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qdg5f" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.381237 4717 scope.go:117] "RemoveContainer" containerID="c2f07dc62c25ac847b3e5d34c70785f2d34003290b241695c8eaea6b3245ea9e" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.389070 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.394198 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qdg5f"] Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.411300 4717 scope.go:117] "RemoveContainer" containerID="a7d595a8d059412ae33f193734e46c4a7325fa43bada1a7a7ede50d35acccd23" Oct 07 14:07:11 crc kubenswrapper[4717]: I1007 14:07:11.415201 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:12 crc kubenswrapper[4717]: I1007 14:07:12.874925 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" path="/var/lib/kubelet/pods/703c4d91-5ec3-4126-8c70-440f48368d69/volumes" Oct 07 14:07:14 crc kubenswrapper[4717]: I1007 14:07:14.661846 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:14 crc kubenswrapper[4717]: I1007 14:07:14.662561 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qb2xx" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="registry-server" containerID="cri-o://1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8" gracePeriod=2 Oct 07 14:07:14 crc kubenswrapper[4717]: I1007 14:07:14.723923 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-749b845cbd-g6bgz" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.056837 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.223505 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwhxl\" (UniqueName: \"kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl\") pod \"8bdf59de-b9e5-43db-be81-364da542c44e\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.223953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content\") pod \"8bdf59de-b9e5-43db-be81-364da542c44e\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.223988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities\") pod \"8bdf59de-b9e5-43db-be81-364da542c44e\" (UID: \"8bdf59de-b9e5-43db-be81-364da542c44e\") " Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.224958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities" (OuterVolumeSpecName: "utilities") pod "8bdf59de-b9e5-43db-be81-364da542c44e" (UID: "8bdf59de-b9e5-43db-be81-364da542c44e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.229259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl" (OuterVolumeSpecName: "kube-api-access-dwhxl") pod "8bdf59de-b9e5-43db-be81-364da542c44e" (UID: "8bdf59de-b9e5-43db-be81-364da542c44e"). InnerVolumeSpecName "kube-api-access-dwhxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.274957 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bdf59de-b9e5-43db-be81-364da542c44e" (UID: "8bdf59de-b9e5-43db-be81-364da542c44e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.325081 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.325124 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bdf59de-b9e5-43db-be81-364da542c44e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.325138 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwhxl\" (UniqueName: \"kubernetes.io/projected/8bdf59de-b9e5-43db-be81-364da542c44e-kube-api-access-dwhxl\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.396672 4717 generic.go:334] "Generic (PLEG): container finished" podID="8bdf59de-b9e5-43db-be81-364da542c44e" containerID="1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8" exitCode=0 Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.396726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerDied","Data":"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8"} Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.396769 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qb2xx" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.396781 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qb2xx" event={"ID":"8bdf59de-b9e5-43db-be81-364da542c44e","Type":"ContainerDied","Data":"459e65b21cebbd19601bddecd31c18a08f724d553ba27344b130e41e660fc083"} Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.396801 4717 scope.go:117] "RemoveContainer" containerID="1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.411932 4717 scope.go:117] "RemoveContainer" containerID="8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.427054 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.431852 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qb2xx"] Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.435844 4717 scope.go:117] "RemoveContainer" containerID="7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.450781 4717 scope.go:117] "RemoveContainer" containerID="1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8" Oct 07 14:07:15 crc kubenswrapper[4717]: E1007 14:07:15.451229 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8\": container with ID starting with 1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8 not found: ID does not exist" containerID="1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.451256 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8"} err="failed to get container status \"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8\": rpc error: code = NotFound desc = could not find container \"1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8\": container with ID starting with 1da17d4d42884332112a5089b4c3017eef1efc2f8f383d4047617f3199aef1c8 not found: ID does not exist" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.451277 4717 scope.go:117] "RemoveContainer" containerID="8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094" Oct 07 14:07:15 crc kubenswrapper[4717]: E1007 14:07:15.451468 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094\": container with ID starting with 8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094 not found: ID does not exist" containerID="8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.451485 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094"} err="failed to get container status \"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094\": rpc error: code = NotFound desc = could not find container \"8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094\": container with ID starting with 8ed83c22f2be0d6c37636b0eda7b355fd9235c5b0d773072fb9872a8377ae094 not found: ID does not exist" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.451497 4717 scope.go:117] "RemoveContainer" containerID="7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335" Oct 07 14:07:15 crc kubenswrapper[4717]: E1007 14:07:15.451783 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335\": container with ID starting with 7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335 not found: ID does not exist" containerID="7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335" Oct 07 14:07:15 crc kubenswrapper[4717]: I1007 14:07:15.451808 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335"} err="failed to get container status \"7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335\": rpc error: code = NotFound desc = could not find container \"7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335\": container with ID starting with 7371f407938057769a7cd02e19784369d42472916f97054f418bbc012095f335 not found: ID does not exist" Oct 07 14:07:16 crc kubenswrapper[4717]: I1007 14:07:16.874405 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" path="/var/lib/kubelet/pods/8bdf59de-b9e5-43db-be81-364da542c44e/volumes" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672235 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672888 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="extract-utilities" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672899 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="extract-utilities" Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672912 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="extract-utilities" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672918 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="extract-utilities" Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672930 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672936 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672944 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672950 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672957 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="extract-content" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672962 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="extract-content" Oct 07 14:07:24 crc kubenswrapper[4717]: E1007 14:07:24.672973 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="extract-content" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.672978 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="extract-content" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.673103 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bdf59de-b9e5-43db-be81-364da542c44e" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.673120 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="703c4d91-5ec3-4126-8c70-440f48368d69" containerName="registry-server" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.673851 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.688715 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.859608 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dmc\" (UniqueName: \"kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.859664 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.859701 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.960366 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dmc\" (UniqueName: \"kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.960422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.960456 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.960854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.960911 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.986739 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w8dmc\" (UniqueName: \"kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc\") pod \"redhat-marketplace-tr28v\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:24 crc kubenswrapper[4717]: I1007 14:07:24.993029 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:25 crc kubenswrapper[4717]: I1007 14:07:25.390576 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:25 crc kubenswrapper[4717]: I1007 14:07:25.461314 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerStarted","Data":"ddb5c1e97695c769c6029299bcccc9885d0ab40a590183ca4bf103570ba2dddb"} Oct 07 14:07:26 crc kubenswrapper[4717]: I1007 14:07:26.468744 4717 generic.go:334] "Generic (PLEG): container finished" podID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerID="67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8" exitCode=0 Oct 07 14:07:26 crc kubenswrapper[4717]: I1007 14:07:26.468790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerDied","Data":"67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8"} Oct 07 14:07:27 crc kubenswrapper[4717]: I1007 14:07:27.476056 4717 generic.go:334] "Generic (PLEG): container finished" podID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerID="7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6" exitCode=0 Oct 07 14:07:27 crc kubenswrapper[4717]: I1007 14:07:27.476147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerDied","Data":"7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6"} Oct 07 14:07:28 crc kubenswrapper[4717]: I1007 14:07:28.483331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerStarted","Data":"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d"} Oct 07 14:07:28 crc kubenswrapper[4717]: I1007 14:07:28.504525 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tr28v" podStartSLOduration=2.877502393 podStartE2EDuration="4.504501574s" podCreationTimestamp="2025-10-07 14:07:24 +0000 UTC" firstStartedPulling="2025-10-07 14:07:26.470776671 +0000 UTC m=+828.298702473" lastFinishedPulling="2025-10-07 14:07:28.097775852 +0000 UTC m=+829.925701654" observedRunningTime="2025-10-07 14:07:28.501667115 +0000 UTC m=+830.329592927" watchObservedRunningTime="2025-10-07 14:07:28.504501574 +0000 UTC m=+830.332427386" Oct 07 14:07:34 crc kubenswrapper[4717]: I1007 14:07:34.332865 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b67b56d8d-7vj44" Oct 07 14:07:34 crc kubenswrapper[4717]: I1007 14:07:34.993586 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:34 crc kubenswrapper[4717]: I1007 14:07:34.993882 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.047522 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.080267 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-b4l79"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.083226 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.084714 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.085209 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.085312 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.087693 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.089114 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-wlmtk" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.090547 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.096691 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-startup\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192218 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cf4c\" (UniqueName: \"kubernetes.io/projected/c6ca3b03-14b6-45f2-828e-35d06c8455b9-kube-api-access-5cf4c\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192273 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-conf\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192296 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192392 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-reloader\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-sockets\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.192464 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c47lg\" (UniqueName: \"kubernetes.io/projected/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-kube-api-access-c47lg\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.198466 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-wdrxn"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.199344 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.200907 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.201220 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.201335 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-m9v9h" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.202792 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.217149 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-rnp2p"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.217962 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.230289 4717 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.244443 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rnp2p"] Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-sockets\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c47lg\" (UniqueName: \"kubernetes.io/projected/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-kube-api-access-c47lg\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294270 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-startup\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294289 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294315 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cf4c\" (UniqueName: \"kubernetes.io/projected/c6ca3b03-14b6-45f2-828e-35d06c8455b9-kube-api-access-5cf4c\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-conf\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294355 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294371 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294394 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" 
(UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-reloader\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-reloader\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.294976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-sockets\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.295970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-startup\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.296056 4717 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.296095 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs podName:a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6 nodeName:}" failed. No retries permitted until 2025-10-07 14:07:35.796083055 +0000 UTC m=+837.624008847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs") pod "frr-k8s-b4l79" (UID: "a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6") : secret "frr-k8s-certs-secret" not found Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.296516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-frr-conf\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.296735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.296791 4717 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.296814 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert podName:c6ca3b03-14b6-45f2-828e-35d06c8455b9 nodeName:}" failed. No retries permitted until 2025-10-07 14:07:35.796806845 +0000 UTC m=+837.624732637 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert") pod "frr-k8s-webhook-server-64bf5d555-t4gtn" (UID: "c6ca3b03-14b6-45f2-828e-35d06c8455b9") : secret "frr-k8s-webhook-server-cert" not found Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.326938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cf4c\" (UniqueName: \"kubernetes.io/projected/c6ca3b03-14b6-45f2-828e-35d06c8455b9-kube-api-access-5cf4c\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.338627 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c47lg\" (UniqueName: \"kubernetes.io/projected/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-kube-api-access-c47lg\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396621 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-metrics-certs\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396685 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqh6f\" (UniqueName: \"kubernetes.io/projected/9d7d987f-b766-481c-bcd5-ef6ec6e32956-kube-api-access-xqh6f\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396710 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89f792c4-3345-4538-8445-39f6ebfb784c-metallb-excludel2\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-cert\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.396906 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5nl\" (UniqueName: \"kubernetes.io/projected/89f792c4-3345-4538-8445-39f6ebfb784c-kube-api-access-8g5nl\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498245 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5nl\" (UniqueName: \"kubernetes.io/projected/89f792c4-3345-4538-8445-39f6ebfb784c-kube-api-access-8g5nl\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-metrics-certs\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498390 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqh6f\" (UniqueName: \"kubernetes.io/projected/9d7d987f-b766-481c-bcd5-ef6ec6e32956-kube-api-access-xqh6f\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498419 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89f792c4-3345-4538-8445-39f6ebfb784c-metallb-excludel2\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.498478 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-cert\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.498595 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.498660 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist podName:89f792c4-3345-4538-8445-39f6ebfb784c nodeName:}" failed. No retries permitted until 2025-10-07 14:07:35.998642163 +0000 UTC m=+837.826567955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist") pod "speaker-wdrxn" (UID: "89f792c4-3345-4538-8445-39f6ebfb784c") : secret "metallb-memberlist" not found Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.499170 4717 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 07 14:07:35 crc kubenswrapper[4717]: E1007 14:07:35.499274 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs podName:9d7d987f-b766-481c-bcd5-ef6ec6e32956 nodeName:}" failed. No retries permitted until 2025-10-07 14:07:35.99925169 +0000 UTC m=+837.827177542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs") pod "controller-68d546b9d8-rnp2p" (UID: "9d7d987f-b766-481c-bcd5-ef6ec6e32956") : secret "controller-certs-secret" not found Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.499317 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/89f792c4-3345-4538-8445-39f6ebfb784c-metallb-excludel2\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.501665 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-cert\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.502494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-metrics-certs\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.516926 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5nl\" (UniqueName: \"kubernetes.io/projected/89f792c4-3345-4538-8445-39f6ebfb784c-kube-api-access-8g5nl\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.519821 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqh6f\" (UniqueName: \"kubernetes.io/projected/9d7d987f-b766-481c-bcd5-ef6ec6e32956-kube-api-access-xqh6f\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.560291 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.802244 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 
14:07:35.802632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.806221 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6-metrics-certs\") pod \"frr-k8s-b4l79\" (UID: \"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6\") " pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:35 crc kubenswrapper[4717]: I1007 14:07:35.806837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6ca3b03-14b6-45f2-828e-35d06c8455b9-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t4gtn\" (UID: \"c6ca3b03-14b6-45f2-828e-35d06c8455b9\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.002822 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.005600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.005632 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:36 crc kubenswrapper[4717]: E1007 14:07:36.005747 4717 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 07 14:07:36 crc kubenswrapper[4717]: E1007 14:07:36.005789 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist podName:89f792c4-3345-4538-8445-39f6ebfb784c nodeName:}" failed. No retries permitted until 2025-10-07 14:07:37.005775086 +0000 UTC m=+838.833700878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist") pod "speaker-wdrxn" (UID: "89f792c4-3345-4538-8445-39f6ebfb784c") : secret "metallb-memberlist" not found Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.009194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d7d987f-b766-481c-bcd5-ef6ec6e32956-metrics-certs\") pod \"controller-68d546b9d8-rnp2p\" (UID: \"9d7d987f-b766-481c-bcd5-ef6ec6e32956\") " pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.015492 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.130948 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.421939 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn"] Oct 07 14:07:36 crc kubenswrapper[4717]: W1007 14:07:36.428378 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ca3b03_14b6_45f2_828e_35d06c8455b9.slice/crio-ef6de7f75c37612cf00fb2644452b2cb8cf38910eac6bc79f10a17bfc1cf3149 WatchSource:0}: Error finding container ef6de7f75c37612cf00fb2644452b2cb8cf38910eac6bc79f10a17bfc1cf3149: Status 404 returned error can't find the container with id ef6de7f75c37612cf00fb2644452b2cb8cf38910eac6bc79f10a17bfc1cf3149 Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.521076 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-rnp2p"] Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.524562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"e473a6b441fdcfdda717a46d8bc198d7ffe42b0e674e111190d9213e989d1599"} Oct 07 14:07:36 crc kubenswrapper[4717]: W1007 14:07:36.524561 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7d987f_b766_481c_bcd5_ef6ec6e32956.slice/crio-0cf3b75c05572c4de75f677babf2706719df8f8b30566f706cdca80ed1e21502 WatchSource:0}: Error finding container 0cf3b75c05572c4de75f677babf2706719df8f8b30566f706cdca80ed1e21502: Status 404 returned error can't find the container with id 0cf3b75c05572c4de75f677babf2706719df8f8b30566f706cdca80ed1e21502 Oct 07 14:07:36 crc kubenswrapper[4717]: I1007 14:07:36.525668 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" event={"ID":"c6ca3b03-14b6-45f2-828e-35d06c8455b9","Type":"ContainerStarted","Data":"ef6de7f75c37612cf00fb2644452b2cb8cf38910eac6bc79f10a17bfc1cf3149"} Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.018522 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.023885 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/89f792c4-3345-4538-8445-39f6ebfb784c-memberlist\") pod \"speaker-wdrxn\" (UID: \"89f792c4-3345-4538-8445-39f6ebfb784c\") " pod="metallb-system/speaker-wdrxn" Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.265851 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.312175 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-wdrxn" Oct 07 14:07:37 crc kubenswrapper[4717]: W1007 14:07:37.365752 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f792c4_3345_4538_8445_39f6ebfb784c.slice/crio-ff3f212f7db5073683fdb431edca2ce2706c74169158eba2de43030214fdd95c WatchSource:0}: Error finding container ff3f212f7db5073683fdb431edca2ce2706c74169158eba2de43030214fdd95c: Status 404 returned error can't find the container with id ff3f212f7db5073683fdb431edca2ce2706c74169158eba2de43030214fdd95c Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.551967 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wdrxn" event={"ID":"89f792c4-3345-4538-8445-39f6ebfb784c","Type":"ContainerStarted","Data":"ff3f212f7db5073683fdb431edca2ce2706c74169158eba2de43030214fdd95c"} Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.556116 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tr28v" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="registry-server" containerID="cri-o://3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d" gracePeriod=2 Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.556838 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rnp2p" event={"ID":"9d7d987f-b766-481c-bcd5-ef6ec6e32956","Type":"ContainerStarted","Data":"8eed40fe8636286b3d5acbed04a38ca9602cf7ff3d5af7e5a0ce8c3ce1520e55"} Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.556880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.556893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rnp2p" event={"ID":"9d7d987f-b766-481c-bcd5-ef6ec6e32956","Type":"ContainerStarted","Data":"83e652a5201596c3ee505ce9facfaed4fd6dcf03fd50ccc0708239446f075487"} Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.556904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-rnp2p" event={"ID":"9d7d987f-b766-481c-bcd5-ef6ec6e32956","Type":"ContainerStarted","Data":"0cf3b75c05572c4de75f677babf2706719df8f8b30566f706cdca80ed1e21502"} Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.585340 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-rnp2p" podStartSLOduration=2.585320322 podStartE2EDuration="2.585320322s" podCreationTimestamp="2025-10-07 14:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:07:37.582175824 +0000 UTC m=+839.410101616" watchObservedRunningTime="2025-10-07 14:07:37.585320322 +0000 UTC m=+839.413246114" Oct 07 14:07:37 crc kubenswrapper[4717]: I1007 14:07:37.988875 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.133509 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dmc\" (UniqueName: \"kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc\") pod \"7314275a-b4ef-440d-a9c1-333d10e89a48\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.133635 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities\") pod \"7314275a-b4ef-440d-a9c1-333d10e89a48\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.133671 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content\") pod \"7314275a-b4ef-440d-a9c1-333d10e89a48\" (UID: \"7314275a-b4ef-440d-a9c1-333d10e89a48\") " Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.137695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities" (OuterVolumeSpecName: "utilities") pod "7314275a-b4ef-440d-a9c1-333d10e89a48" (UID: "7314275a-b4ef-440d-a9c1-333d10e89a48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.144202 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc" (OuterVolumeSpecName: "kube-api-access-w8dmc") pod "7314275a-b4ef-440d-a9c1-333d10e89a48" (UID: "7314275a-b4ef-440d-a9c1-333d10e89a48"). InnerVolumeSpecName "kube-api-access-w8dmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.151511 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7314275a-b4ef-440d-a9c1-333d10e89a48" (UID: "7314275a-b4ef-440d-a9c1-333d10e89a48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.234746 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dmc\" (UniqueName: \"kubernetes.io/projected/7314275a-b4ef-440d-a9c1-333d10e89a48-kube-api-access-w8dmc\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.234783 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.234792 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7314275a-b4ef-440d-a9c1-333d10e89a48-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.564828 4717 generic.go:334] "Generic (PLEG): container finished" podID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerID="3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d" exitCode=0 Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.565198 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerDied","Data":"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d"} Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.565242 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tr28v" event={"ID":"7314275a-b4ef-440d-a9c1-333d10e89a48","Type":"ContainerDied","Data":"ddb5c1e97695c769c6029299bcccc9885d0ab40a590183ca4bf103570ba2dddb"} Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.565263 4717 scope.go:117] "RemoveContainer" containerID="3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.565378 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tr28v" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.575489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wdrxn" event={"ID":"89f792c4-3345-4538-8445-39f6ebfb784c","Type":"ContainerStarted","Data":"b3f184a4126718adf66aea5fc6a67f6215a100cf1d7927d17cfd8236906b25cd"} Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.575541 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-wdrxn" event={"ID":"89f792c4-3345-4538-8445-39f6ebfb784c","Type":"ContainerStarted","Data":"20539de1525aaea746be6cd38082a713b1a3827ff0f95d824805fea0c3c3b878"} Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.594075 4717 scope.go:117] "RemoveContainer" containerID="7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.595907 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-wdrxn" podStartSLOduration=3.595891754 podStartE2EDuration="3.595891754s" podCreationTimestamp="2025-10-07 14:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:07:38.591149081 +0000 UTC m=+840.419074873" watchObservedRunningTime="2025-10-07 14:07:38.595891754 +0000 UTC m=+840.423817546" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.608575 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.615613 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tr28v"] Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.627458 4717 scope.go:117] "RemoveContainer" containerID="67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.650407 4717 scope.go:117] "RemoveContainer" containerID="3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d" Oct 07 14:07:38 crc kubenswrapper[4717]: E1007 14:07:38.651475 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d\": container with ID starting with 3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d not found: ID does not exist" containerID="3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.651516 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d"} err="failed to get container status \"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d\": rpc error: code = NotFound desc = could not find container \"3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d\": container with ID starting with 3c8506d28efe04f27996d58ee9f570f11dc20961ddf52e40eda829cc7fb2372d not found: ID does not exist" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.651544 4717 scope.go:117] "RemoveContainer" containerID="7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6" Oct 07 14:07:38 crc kubenswrapper[4717]: E1007 14:07:38.651935 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6\": container with ID starting with 7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6 not found: ID does not exist" containerID="7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.651979 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6"} err="failed to get container status \"7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6\": rpc error: code = NotFound desc = could not find container \"7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6\": container with ID starting with 7533c7d84d86ce040df54ef90d2a2d959ca79be2507807005034b42ef731d3b6 not found: ID does not exist" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.652036 4717 scope.go:117] "RemoveContainer" containerID="67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8" Oct 07 14:07:38 crc kubenswrapper[4717]: E1007 14:07:38.652309 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8\": container with ID starting with 67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8 not found: ID does not exist" containerID="67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.652340 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8"} err="failed to get container status \"67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8\": rpc error: code = NotFound desc = could not find container \"67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8\": container with ID starting with 67fef932c044d19a7b9663e258fd9ee279554104f8149b74d45e5aa99fd711f8 not found: ID does not exist" Oct 07 14:07:38 crc kubenswrapper[4717]: I1007 14:07:38.886405 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" path="/var/lib/kubelet/pods/7314275a-b4ef-440d-a9c1-333d10e89a48/volumes" Oct 07 14:07:39 crc kubenswrapper[4717]: I1007 14:07:39.580952 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-wdrxn" Oct 07 14:07:44 crc kubenswrapper[4717]: I1007 14:07:44.619105 4717 generic.go:334] "Generic (PLEG): container finished" podID="a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6" containerID="84dfbedb8d08b0d7bd2825c73c6bffb2b5ef021c2c94f39f4f5ad2b6edd6e0bd" exitCode=0 Oct 07 14:07:44 crc kubenswrapper[4717]: I1007 14:07:44.619225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerDied","Data":"84dfbedb8d08b0d7bd2825c73c6bffb2b5ef021c2c94f39f4f5ad2b6edd6e0bd"} Oct 07 14:07:44 crc kubenswrapper[4717]: I1007 14:07:44.627168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" event={"ID":"c6ca3b03-14b6-45f2-828e-35d06c8455b9","Type":"ContainerStarted","Data":"c2abd192ad1788249d312572cfa6796e40053ac6a26482e64a367b6b1f64410e"} Oct 07 14:07:44 crc kubenswrapper[4717]: I1007 14:07:44.627423 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:07:44 crc kubenswrapper[4717]: I1007 14:07:44.664221 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" podStartSLOduration=2.627669746 podStartE2EDuration="9.664161724s" podCreationTimestamp="2025-10-07 14:07:35 +0000 UTC" firstStartedPulling="2025-10-07 14:07:36.429732447 +0000 UTC m=+838.257658239" lastFinishedPulling="2025-10-07 14:07:43.466224425 +0000 UTC m=+845.294150217" observedRunningTime="2025-10-07 14:07:44.66365272 +0000 UTC m=+846.491578512" watchObservedRunningTime="2025-10-07 14:07:44.664161724 +0000 UTC m=+846.492087516" Oct 07 14:07:45 crc kubenswrapper[4717]: I1007 14:07:45.633830 4717 generic.go:334] "Generic (PLEG): container finished" podID="a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6" containerID="1841a4759223e1fb114a84eba7010ca487bc39c957851e9360dab86024e711c9" exitCode=0 Oct 07 14:07:45 crc kubenswrapper[4717]: I1007 14:07:45.633901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerDied","Data":"1841a4759223e1fb114a84eba7010ca487bc39c957851e9360dab86024e711c9"} Oct 07 14:07:46 crc kubenswrapper[4717]: I1007 14:07:46.135289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-rnp2p" Oct 07 14:07:46 crc kubenswrapper[4717]: I1007 14:07:46.643874 4717 generic.go:334] "Generic (PLEG): container finished" podID="a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6" containerID="9a97a5a4ade76a0cebb0c5288f57e5604662a65d8d11c5880417b5bd5a47b36d" exitCode=0 Oct 07 14:07:46 crc kubenswrapper[4717]: I1007 14:07:46.643917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerDied","Data":"9a97a5a4ade76a0cebb0c5288f57e5604662a65d8d11c5880417b5bd5a47b36d"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.316370 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-wdrxn" Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"0e6f7ebff89a2ec6eefd73126102cd7cfbfcefc454f608939b52ddde3c2ad236"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653578 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"2eb77de25d2ace971999479e9d3c9749cccd6428964bbcc287ad0a843faf1868"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"91a62a9fb41c3fa4fa93d736daa7f66cc8319d36b72e467b34380b064d77c0b8"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653623 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"e2843468c3a1569684ecd34e92609b53cedf35233ceb0b5f9f787f9d095234a7"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" 
event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"1c8216b76541848b67842041f19ce81c31403e556c54908169816c6213badf44"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653647 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-b4l79" event={"ID":"a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6","Type":"ContainerStarted","Data":"a05d973c622688d83fe0648f1cc456fa7dbd8ea7fee7b20559c0c2ced2d3df44"} Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.653774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:47 crc kubenswrapper[4717]: I1007 14:07:47.676135 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-b4l79" podStartSLOduration=5.334782958 podStartE2EDuration="12.676114456s" podCreationTimestamp="2025-10-07 14:07:35 +0000 UTC" firstStartedPulling="2025-10-07 14:07:36.14075755 +0000 UTC m=+837.968683342" lastFinishedPulling="2025-10-07 14:07:43.482089048 +0000 UTC m=+845.310014840" observedRunningTime="2025-10-07 14:07:47.673972657 +0000 UTC m=+849.501898469" watchObservedRunningTime="2025-10-07 14:07:47.676114456 +0000 UTC m=+849.504040248" Oct 07 14:07:51 crc kubenswrapper[4717]: I1007 14:07:51.003287 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:51 crc kubenswrapper[4717]: I1007 14:07:51.067579 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.678550 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-g6w94"] Oct 07 14:07:53 crc kubenswrapper[4717]: E1007 14:07:53.679177 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="extract-utilities" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.679192 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="extract-utilities" Oct 07 14:07:53 crc kubenswrapper[4717]: E1007 14:07:53.679205 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="extract-content" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.679213 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="extract-content" Oct 07 14:07:53 crc kubenswrapper[4717]: E1007 14:07:53.679230 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="registry-server" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.679237 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="registry-server" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.679357 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7314275a-b4ef-440d-a9c1-333d10e89a48" containerName="registry-server" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.679834 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.682034 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-vxtzs" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.682187 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.682302 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.707611 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g6w94"] Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.789353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgg7r\" (UniqueName: \"kubernetes.io/projected/a28c8b75-8908-4152-8e0c-2805e594a4b7-kube-api-access-bgg7r\") pod \"openstack-operator-index-g6w94\" (UID: \"a28c8b75-8908-4152-8e0c-2805e594a4b7\") " pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.890348 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgg7r\" (UniqueName: \"kubernetes.io/projected/a28c8b75-8908-4152-8e0c-2805e594a4b7-kube-api-access-bgg7r\") pod \"openstack-operator-index-g6w94\" (UID: \"a28c8b75-8908-4152-8e0c-2805e594a4b7\") " pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.910102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgg7r\" (UniqueName: \"kubernetes.io/projected/a28c8b75-8908-4152-8e0c-2805e594a4b7-kube-api-access-bgg7r\") pod \"openstack-operator-index-g6w94\" (UID: \"a28c8b75-8908-4152-8e0c-2805e594a4b7\") " pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:07:53 crc kubenswrapper[4717]: I1007 14:07:53.994157 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:07:54 crc kubenswrapper[4717]: I1007 14:07:54.392074 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-g6w94"] Oct 07 14:07:54 crc kubenswrapper[4717]: I1007 14:07:54.704385 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g6w94" event={"ID":"a28c8b75-8908-4152-8e0c-2805e594a4b7","Type":"ContainerStarted","Data":"1d7415c8ebf3f3b927b87d24cbf44f9bddc7ff5da37efd4cfa7f12c380578861"} Oct 07 14:07:56 crc kubenswrapper[4717]: I1007 14:07:56.006069 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-b4l79" Oct 07 14:07:56 crc kubenswrapper[4717]: I1007 14:07:56.020293 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t4gtn" Oct 07 14:08:02 crc kubenswrapper[4717]: I1007 14:08:02.749602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-g6w94" event={"ID":"a28c8b75-8908-4152-8e0c-2805e594a4b7","Type":"ContainerStarted","Data":"00870894ad41700abab3f545b1ca65e233a840263bd2d5d6e5a23fb11605b6d1"} Oct 07 14:08:02 crc kubenswrapper[4717]: I1007 14:08:02.764332 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-g6w94" podStartSLOduration=2.293360698 podStartE2EDuration="9.764317771s" podCreationTimestamp="2025-10-07 14:07:53 +0000 UTC" firstStartedPulling="2025-10-07 14:07:54.405260289 +0000 UTC m=+856.233186081" lastFinishedPulling="2025-10-07 14:08:01.876217362 +0000 UTC m=+863.704143154" observedRunningTime="2025-10-07 14:08:02.762701525 +0000 UTC m=+864.590627317" watchObservedRunningTime="2025-10-07 14:08:02.764317771 +0000 UTC m=+864.592243563" Oct 07 14:08:03 crc kubenswrapper[4717]: I1007 14:08:03.994939 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:08:03 crc kubenswrapper[4717]: I1007 14:08:03.995282 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:08:04 crc kubenswrapper[4717]: I1007 14:08:04.021282 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:08:14 crc kubenswrapper[4717]: I1007 14:08:14.019984 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-g6w94" Oct 07 14:08:23 crc kubenswrapper[4717]: I1007 14:08:23.908868 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg"] Oct 07 14:08:23 crc kubenswrapper[4717]: I1007 14:08:23.911574 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:23 crc kubenswrapper[4717]: I1007 14:08:23.913312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vwxls" Oct 07 14:08:23 crc kubenswrapper[4717]: I1007 14:08:23.917925 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg"] Oct 07 14:08:23 crc kubenswrapper[4717]: I1007 14:08:23.999855 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.000176 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82sx\" (UniqueName: \"kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.000279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.101095 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k82sx\" (UniqueName: \"kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.101149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.101184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.101653 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.102026 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.127732 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82sx\" (UniqueName: \"kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx\") pod \"a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.287404 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.485408 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg"] Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.881919 4717 generic.go:334] "Generic (PLEG): container finished" podID="955d7278-00a6-496d-824f-423681b6d873" containerID="b4c9e1913e265db5a93de61c0f5d69ec10dbf95c325ba1b6ca25627e613f3698" exitCode=0 Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.894292 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" event={"ID":"955d7278-00a6-496d-824f-423681b6d873","Type":"ContainerDied","Data":"b4c9e1913e265db5a93de61c0f5d69ec10dbf95c325ba1b6ca25627e613f3698"} Oct 07 14:08:24 crc kubenswrapper[4717]: I1007 14:08:24.894349 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" event={"ID":"955d7278-00a6-496d-824f-423681b6d873","Type":"ContainerStarted","Data":"022affda274d6bb68e5c5f7bec81e58d47cafbabd4def5e7ce6187c98e900a1a"} Oct 07 14:08:25 crc kubenswrapper[4717]: I1007 14:08:25.890678 4717 generic.go:334] "Generic (PLEG): container finished" podID="955d7278-00a6-496d-824f-423681b6d873" containerID="9e539d62ef6e070ccc3847e426dfcc5ffecbbfcb07c388e69a08a4eea89681f0" exitCode=0 Oct 07 14:08:25 crc kubenswrapper[4717]: I1007 14:08:25.890796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" event={"ID":"955d7278-00a6-496d-824f-423681b6d873","Type":"ContainerDied","Data":"9e539d62ef6e070ccc3847e426dfcc5ffecbbfcb07c388e69a08a4eea89681f0"} Oct 07 14:08:26 crc kubenswrapper[4717]: I1007 14:08:26.899635 4717 generic.go:334] "Generic (PLEG): container finished" podID="955d7278-00a6-496d-824f-423681b6d873" containerID="298dc9b96f0bbcc5f11218344dd0a58404a845a1b54e375890efc5ddda62d3ed" exitCode=0 Oct 07 14:08:26 crc kubenswrapper[4717]: I1007 14:08:26.899680 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" event={"ID":"955d7278-00a6-496d-824f-423681b6d873","Type":"ContainerDied","Data":"298dc9b96f0bbcc5f11218344dd0a58404a845a1b54e375890efc5ddda62d3ed"} Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.141344 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.154179 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util\") pod \"955d7278-00a6-496d-824f-423681b6d873\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.154271 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k82sx\" (UniqueName: \"kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx\") pod \"955d7278-00a6-496d-824f-423681b6d873\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.154304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle\") pod \"955d7278-00a6-496d-824f-423681b6d873\" (UID: \"955d7278-00a6-496d-824f-423681b6d873\") " Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.156492 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle" (OuterVolumeSpecName: "bundle") pod "955d7278-00a6-496d-824f-423681b6d873" (UID: "955d7278-00a6-496d-824f-423681b6d873"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.161580 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx" (OuterVolumeSpecName: "kube-api-access-k82sx") pod "955d7278-00a6-496d-824f-423681b6d873" (UID: "955d7278-00a6-496d-824f-423681b6d873"). InnerVolumeSpecName "kube-api-access-k82sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.172124 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util" (OuterVolumeSpecName: "util") pod "955d7278-00a6-496d-824f-423681b6d873" (UID: "955d7278-00a6-496d-824f-423681b6d873"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.256955 4717 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-util\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.257025 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k82sx\" (UniqueName: \"kubernetes.io/projected/955d7278-00a6-496d-824f-423681b6d873-kube-api-access-k82sx\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.257037 4717 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/955d7278-00a6-496d-824f-423681b6d873-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.919305 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" event={"ID":"955d7278-00a6-496d-824f-423681b6d873","Type":"ContainerDied","Data":"022affda274d6bb68e5c5f7bec81e58d47cafbabd4def5e7ce6187c98e900a1a"} Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.919368 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="022affda274d6bb68e5c5f7bec81e58d47cafbabd4def5e7ce6187c98e900a1a" Oct 07 14:08:28 crc kubenswrapper[4717]: I1007 14:08:28.919486 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.135460 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp"] Oct 07 14:08:34 crc kubenswrapper[4717]: E1007 14:08:34.136177 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="pull" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.136191 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="pull" Oct 07 14:08:34 crc kubenswrapper[4717]: E1007 14:08:34.136213 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="extract" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.136219 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="extract" Oct 07 14:08:34 crc kubenswrapper[4717]: E1007 14:08:34.136230 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="util" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.136236 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="util" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.136333 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="955d7278-00a6-496d-824f-423681b6d873" containerName="extract" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.152541 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.155118 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-rzmrr" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.163743 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp"] Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.224951 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df5wr\" (UniqueName: \"kubernetes.io/projected/da883109-9f27-447d-aa39-aa7dcf80f19f-kube-api-access-df5wr\") pod \"openstack-operator-controller-operator-5fc947cc4b-xh5qp\" (UID: \"da883109-9f27-447d-aa39-aa7dcf80f19f\") " pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.326595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-df5wr\" (UniqueName: \"kubernetes.io/projected/da883109-9f27-447d-aa39-aa7dcf80f19f-kube-api-access-df5wr\") pod \"openstack-operator-controller-operator-5fc947cc4b-xh5qp\" (UID: \"da883109-9f27-447d-aa39-aa7dcf80f19f\") " pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.353395 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-df5wr\" (UniqueName: \"kubernetes.io/projected/da883109-9f27-447d-aa39-aa7dcf80f19f-kube-api-access-df5wr\") pod \"openstack-operator-controller-operator-5fc947cc4b-xh5qp\" (UID: \"da883109-9f27-447d-aa39-aa7dcf80f19f\") " pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:34 crc kubenswrapper[4717]: I1007 14:08:34.478286 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:35 crc kubenswrapper[4717]: I1007 14:08:35.033081 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp"] Oct 07 14:08:35 crc kubenswrapper[4717]: I1007 14:08:35.955787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" event={"ID":"da883109-9f27-447d-aa39-aa7dcf80f19f","Type":"ContainerStarted","Data":"fe726f8b0e6e991f265cc98e606fbb461ce56ca14a0291016f640d19ec2c4ae2"} Oct 07 14:08:38 crc kubenswrapper[4717]: I1007 14:08:38.973069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" event={"ID":"da883109-9f27-447d-aa39-aa7dcf80f19f","Type":"ContainerStarted","Data":"68612a57f80da0642bd131f2e8073d1937c4f32e3e4c7abe13efcfe7f953f2aa"} Oct 07 14:08:40 crc kubenswrapper[4717]: I1007 14:08:40.985910 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" event={"ID":"da883109-9f27-447d-aa39-aa7dcf80f19f","Type":"ContainerStarted","Data":"7412ce33581cc28ad6811cc95c2937395c6339533b797d283cecbcf3e922dfc7"} Oct 07 14:08:40 crc kubenswrapper[4717]: I1007 14:08:40.986430 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:44 crc kubenswrapper[4717]: I1007 14:08:44.480769 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" Oct 07 14:08:44 crc kubenswrapper[4717]: I1007 14:08:44.509444 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5fc947cc4b-xh5qp" podStartSLOduration=4.731750697 podStartE2EDuration="10.509425004s" podCreationTimestamp="2025-10-07 14:08:34 +0000 UTC" firstStartedPulling="2025-10-07 14:08:35.046796315 +0000 UTC m=+896.874722107" lastFinishedPulling="2025-10-07 14:08:40.824470622 +0000 UTC m=+902.652396414" observedRunningTime="2025-10-07 14:08:41.015271023 +0000 UTC m=+902.843196815" watchObservedRunningTime="2025-10-07 14:08:44.509425004 +0000 UTC m=+906.337350796" Oct 07 14:09:01 crc kubenswrapper[4717]: I1007 14:09:01.609848 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:09:01 crc kubenswrapper[4717]: I1007 14:09:01.610301 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.486799 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.488528 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.490741 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-q955g" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.493727 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.495625 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.499907 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-hgzn7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.507243 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.532727 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.534227 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.536757 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-z6cv9" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.538120 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.539384 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.542093 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5txl6" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.550445 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.560501 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.575158 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.589890 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.591367 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.598283 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jfz8f" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.606783 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.607745 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.611383 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-z488r" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615138 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfhnl\" (UniqueName: \"kubernetes.io/projected/cfd500c0-e80e-4553-affd-c3b5437d67b7-kube-api-access-wfhnl\") pod \"heat-operator-controller-manager-54b4974c45-8zlpx\" (UID: \"cfd500c0-e80e-4553-affd-c3b5437d67b7\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615195 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m96\" (UniqueName: \"kubernetes.io/projected/c56f0658-961c-4727-8ba8-5d24af8523dd-kube-api-access-77m96\") pod \"cinder-operator-controller-manager-7d4d4f8d-sd746\" (UID: \"c56f0658-961c-4727-8ba8-5d24af8523dd\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmcv\" (UniqueName: \"kubernetes.io/projected/303da62c-428b-4494-9ad6-4168652dcd4e-kube-api-access-hbmcv\") pod \"barbican-operator-controller-manager-58c4cd55f4-xd2vv\" (UID: \"303da62c-428b-4494-9ad6-4168652dcd4e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615249 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflwz\" (UniqueName: \"kubernetes.io/projected/ee9e55bf-132a-44ef-82ca-0e1ac422afd3-kube-api-access-jflwz\") pod \"designate-operator-controller-manager-75dfd9b554-pfxp7\" (UID: \"ee9e55bf-132a-44ef-82ca-0e1ac422afd3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615270 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4hh9\" (UniqueName: \"kubernetes.io/projected/0aa98a33-f3d0-49ee-8909-a88e320aa26c-kube-api-access-z4hh9\") pod \"glance-operator-controller-manager-5dc44df7d5-8xb4k\" (UID: \"0aa98a33-f3d0-49ee-8909-a88e320aa26c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.615287 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgnj6\" (UniqueName: 
\"kubernetes.io/projected/acb68432-c913-4c77-bfc8-ef15f9e74a1c-kube-api-access-lgnj6\") pod \"horizon-operator-controller-manager-76d5b87f47-q8qkj\" (UID: \"acb68432-c913-4c77-bfc8-ef15f9e74a1c\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.623129 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.630435 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.645467 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.667823 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.674963 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.676105 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-42xx9" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.689496 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.713981 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.725258 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-qxxd9" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.726110 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m96\" (UniqueName: \"kubernetes.io/projected/c56f0658-961c-4727-8ba8-5d24af8523dd-kube-api-access-77m96\") pod \"cinder-operator-controller-manager-7d4d4f8d-sd746\" (UID: \"c56f0658-961c-4727-8ba8-5d24af8523dd\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44cw\" (UniqueName: \"kubernetes.io/projected/b4fabdda-208b-48e9-b4d9-85e638c74ad4-kube-api-access-g44cw\") pod \"ironic-operator-controller-manager-649675d675-jdbmx\" (UID: \"b4fabdda-208b-48e9-b4d9-85e638c74ad4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727488 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmcv\" (UniqueName: \"kubernetes.io/projected/303da62c-428b-4494-9ad6-4168652dcd4e-kube-api-access-hbmcv\") pod \"barbican-operator-controller-manager-58c4cd55f4-xd2vv\" (UID: \"303da62c-428b-4494-9ad6-4168652dcd4e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727528 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflwz\" (UniqueName: \"kubernetes.io/projected/ee9e55bf-132a-44ef-82ca-0e1ac422afd3-kube-api-access-jflwz\") pod \"designate-operator-controller-manager-75dfd9b554-pfxp7\" (UID: \"ee9e55bf-132a-44ef-82ca-0e1ac422afd3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727570 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4hh9\" (UniqueName: \"kubernetes.io/projected/0aa98a33-f3d0-49ee-8909-a88e320aa26c-kube-api-access-z4hh9\") pod \"glance-operator-controller-manager-5dc44df7d5-8xb4k\" (UID: \"0aa98a33-f3d0-49ee-8909-a88e320aa26c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727598 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgnj6\" (UniqueName: \"kubernetes.io/projected/acb68432-c913-4c77-bfc8-ef15f9e74a1c-kube-api-access-lgnj6\") pod \"horizon-operator-controller-manager-76d5b87f47-q8qkj\" (UID: \"acb68432-c913-4c77-bfc8-ef15f9e74a1c\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.727705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfhnl\" (UniqueName: 
\"kubernetes.io/projected/cfd500c0-e80e-4553-affd-c3b5437d67b7-kube-api-access-wfhnl\") pod \"heat-operator-controller-manager-54b4974c45-8zlpx\" (UID: \"cfd500c0-e80e-4553-affd-c3b5437d67b7\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.737188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.763178 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.765289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.773837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgnj6\" (UniqueName: \"kubernetes.io/projected/acb68432-c913-4c77-bfc8-ef15f9e74a1c-kube-api-access-lgnj6\") pod \"horizon-operator-controller-manager-76d5b87f47-q8qkj\" (UID: \"acb68432-c913-4c77-bfc8-ef15f9e74a1c\") " pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.775389 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m96\" (UniqueName: \"kubernetes.io/projected/c56f0658-961c-4727-8ba8-5d24af8523dd-kube-api-access-77m96\") pod \"cinder-operator-controller-manager-7d4d4f8d-sd746\" (UID: \"c56f0658-961c-4727-8ba8-5d24af8523dd\") " pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.783470 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbmcv\" (UniqueName: \"kubernetes.io/projected/303da62c-428b-4494-9ad6-4168652dcd4e-kube-api-access-hbmcv\") pod \"barbican-operator-controller-manager-58c4cd55f4-xd2vv\" (UID: \"303da62c-428b-4494-9ad6-4168652dcd4e\") " pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.783554 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.785604 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.786440 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-qxw8p" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.787490 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.794987 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-sxvbl" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.803128 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.804517 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.809707 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.815346 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.819648 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nfghz" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.825331 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834158 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44cw\" (UniqueName: \"kubernetes.io/projected/b4fabdda-208b-48e9-b4d9-85e638c74ad4-kube-api-access-g44cw\") pod \"ironic-operator-controller-manager-649675d675-jdbmx\" (UID: \"b4fabdda-208b-48e9-b4d9-85e638c74ad4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834218 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jd2j\" (UniqueName: \"kubernetes.io/projected/47df3c50-6937-4479-8463-4d6816e354d4-kube-api-access-6jd2j\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqkd\" (UniqueName: \"kubernetes.io/projected/d7327559-8f4c-4745-acde-51a8ec9ca67a-kube-api-access-kmqkd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8xkj4\" (UID: \"d7327559-8f4c-4745-acde-51a8ec9ca67a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834350 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htw2r\" (UniqueName: \"kubernetes.io/projected/763bbe43-a00e-4a82-b90c-bd56ab6a516a-kube-api-access-htw2r\") pod \"manila-operator-controller-manager-65d89cfd9f-7stm5\" (UID: \"763bbe43-a00e-4a82-b90c-bd56ab6a516a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834404 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/d0d37d45-9608-4b1d-97b0-62f8b36ab834-kube-api-access-jjcbz\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-znbv7\" (UID: \"d0d37d45-9608-4b1d-97b0-62f8b36ab834\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.834818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4hh9\" (UniqueName: \"kubernetes.io/projected/0aa98a33-f3d0-49ee-8909-a88e320aa26c-kube-api-access-z4hh9\") pod \"glance-operator-controller-manager-5dc44df7d5-8xb4k\" (UID: \"0aa98a33-f3d0-49ee-8909-a88e320aa26c\") " pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.835569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflwz\" (UniqueName: \"kubernetes.io/projected/ee9e55bf-132a-44ef-82ca-0e1ac422afd3-kube-api-access-jflwz\") pod \"designate-operator-controller-manager-75dfd9b554-pfxp7\" (UID: \"ee9e55bf-132a-44ef-82ca-0e1ac422afd3\") " pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.835737 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfhnl\" (UniqueName: \"kubernetes.io/projected/cfd500c0-e80e-4553-affd-c3b5437d67b7-kube-api-access-wfhnl\") pod \"heat-operator-controller-manager-54b4974c45-8zlpx\" (UID: \"cfd500c0-e80e-4553-affd-c3b5437d67b7\") " pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.865516 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.867101 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.870450 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.892958 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.894126 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.903226 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5cqnt" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.903870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44cw\" (UniqueName: \"kubernetes.io/projected/b4fabdda-208b-48e9-b4d9-85e638c74ad4-kube-api-access-g44cw\") pod \"ironic-operator-controller-manager-649675d675-jdbmx\" (UID: \"b4fabdda-208b-48e9-b4d9-85e638c74ad4\") " pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.915716 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.919855 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.920818 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.927282 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-27nlz" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.932422 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935491 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htw2r\" (UniqueName: \"kubernetes.io/projected/763bbe43-a00e-4a82-b90c-bd56ab6a516a-kube-api-access-htw2r\") pod \"manila-operator-controller-manager-65d89cfd9f-7stm5\" (UID: \"763bbe43-a00e-4a82-b90c-bd56ab6a516a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/d0d37d45-9608-4b1d-97b0-62f8b36ab834-kube-api-access-jjcbz\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-znbv7\" (UID: \"d0d37d45-9608-4b1d-97b0-62f8b36ab834\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935604 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cjn5\" (UniqueName: \"kubernetes.io/projected/dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f-kube-api-access-2cjn5\") pod \"nova-operator-controller-manager-7c7fc454ff-28ncp\" (UID: \"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " 
pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935675 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr8j9\" (UniqueName: \"kubernetes.io/projected/1edc1259-6db8-4b89-86fa-6410a3d5931d-kube-api-access-sr8j9\") pod \"neutron-operator-controller-manager-8d984cc4d-f7qpd\" (UID: \"1edc1259-6db8-4b89-86fa-6410a3d5931d\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jd2j\" (UniqueName: \"kubernetes.io/projected/47df3c50-6937-4479-8463-4d6816e354d4-kube-api-access-6jd2j\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.935727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqkd\" (UniqueName: \"kubernetes.io/projected/d7327559-8f4c-4745-acde-51a8ec9ca67a-kube-api-access-kmqkd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8xkj4\" (UID: \"d7327559-8f4c-4745-acde-51a8ec9ca67a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:09 crc kubenswrapper[4717]: E1007 14:09:09.936355 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 14:09:09 crc kubenswrapper[4717]: E1007 14:09:09.936396 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert podName:47df3c50-6937-4479-8463-4d6816e354d4 nodeName:}" failed. No retries permitted until 2025-10-07 14:09:10.436379618 +0000 UTC m=+932.264305410 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert") pod "infra-operator-controller-manager-658588b8c9-b4l9q" (UID: "47df3c50-6937-4479-8463-4d6816e354d4") : secret "infra-operator-webhook-server-cert" not found Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.964216 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd"] Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.964983 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjcbz\" (UniqueName: \"kubernetes.io/projected/d0d37d45-9608-4b1d-97b0-62f8b36ab834-kube-api-access-jjcbz\") pod \"mariadb-operator-controller-manager-6cd6d7bdf5-znbv7\" (UID: \"d0d37d45-9608-4b1d-97b0-62f8b36ab834\") " pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.970553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqkd\" (UniqueName: \"kubernetes.io/projected/d7327559-8f4c-4745-acde-51a8ec9ca67a-kube-api-access-kmqkd\") pod \"keystone-operator-controller-manager-7b5ccf6d9c-8xkj4\" (UID: \"d7327559-8f4c-4745-acde-51a8ec9ca67a\") " pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:09 crc kubenswrapper[4717]: I1007 14:09:09.978852 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:09.998681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jd2j\" (UniqueName: \"kubernetes.io/projected/47df3c50-6937-4479-8463-4d6816e354d4-kube-api-access-6jd2j\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.024546 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htw2r\" (UniqueName: \"kubernetes.io/projected/763bbe43-a00e-4a82-b90c-bd56ab6a516a-kube-api-access-htw2r\") pod \"manila-operator-controller-manager-65d89cfd9f-7stm5\" (UID: \"763bbe43-a00e-4a82-b90c-bd56ab6a516a\") " pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.027062 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.036502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjn5\" (UniqueName: \"kubernetes.io/projected/dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f-kube-api-access-2cjn5\") pod \"nova-operator-controller-manager-7c7fc454ff-28ncp\" (UID: \"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.036784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr8j9\" (UniqueName: \"kubernetes.io/projected/1edc1259-6db8-4b89-86fa-6410a3d5931d-kube-api-access-sr8j9\") pod \"neutron-operator-controller-manager-8d984cc4d-f7qpd\" (UID: \"1edc1259-6db8-4b89-86fa-6410a3d5931d\") " 
pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.041335 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.051073 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.051268 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.063336 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.070341 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-65gn4" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.077988 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.079105 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.081307 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2nt89" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.089360 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.090396 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.091624 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.094555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cjn5\" (UniqueName: \"kubernetes.io/projected/dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f-kube-api-access-2cjn5\") pod \"nova-operator-controller-manager-7c7fc454ff-28ncp\" (UID: \"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f\") " pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.095822 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-b7f7r" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.095997 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.099548 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.101070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr8j9\" (UniqueName: \"kubernetes.io/projected/1edc1259-6db8-4b89-86fa-6410a3d5931d-kube-api-access-sr8j9\") pod \"neutron-operator-controller-manager-8d984cc4d-f7qpd\" (UID: \"1edc1259-6db8-4b89-86fa-6410a3d5931d\") " pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.103891 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.107150 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.109397 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.118763 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k28qg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.120200 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.148241 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6xs7j" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.186076 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.186153 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.193659 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.203237 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.232261 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.235634 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-62cfh" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.249704 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6rc8\" (UniqueName: \"kubernetes.io/projected/9af7f241-0d0a-457e-9332-51d88b1a52d1-kube-api-access-d6rc8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-x5psc\" (UID: \"9af7f241-0d0a-457e-9332-51d88b1a52d1\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.252472 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.253234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz9cl\" (UniqueName: \"kubernetes.io/projected/af994010-9d35-4fcd-b444-64acb6b65577-kube-api-access-sz9cl\") pod \"octavia-operator-controller-manager-7468f855d8-vlk87\" (UID: \"af994010-9d35-4fcd-b444-64acb6b65577\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.253294 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m4rk\" (UniqueName: \"kubernetes.io/projected/399188c6-e2ce-4c19-93ac-aec1a685d28c-kube-api-access-2m4rk\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.253329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bx6\" (UniqueName: \"kubernetes.io/projected/21e03a5b-f9f2-4e57-90d3-03edcbb7e2db-kube-api-access-w5bx6\") pod \"swift-operator-controller-manager-6859f9b676-slwnb\" (UID: \"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.253388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndzpc\" (UniqueName: \"kubernetes.io/projected/b2ac5ce4-5123-4d82-aca2-a776d4f89f09-kube-api-access-ndzpc\") pod \"placement-operator-controller-manager-54689d9f88-4dfwm\" (UID: \"b2ac5ce4-5123-4d82-aca2-a776d4f89f09\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.253454 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.272246 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.297422 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.298681 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.303783 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fnwxq" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.309586 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.314072 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.323332 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6x6z\" (UniqueName: \"kubernetes.io/projected/c046dbff-c7b3-464b-97c7-ae47f24bcd61-kube-api-access-r6x6z\") pod \"telemetry-operator-controller-manager-5d4d74dd89-bxcrn\" (UID: \"c046dbff-c7b3-464b-97c7-ae47f24bcd61\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357223 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6rc8\" (UniqueName: \"kubernetes.io/projected/9af7f241-0d0a-457e-9332-51d88b1a52d1-kube-api-access-d6rc8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-x5psc\" (UID: \"9af7f241-0d0a-457e-9332-51d88b1a52d1\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz9cl\" (UniqueName: \"kubernetes.io/projected/af994010-9d35-4fcd-b444-64acb6b65577-kube-api-access-sz9cl\") pod \"octavia-operator-controller-manager-7468f855d8-vlk87\" (UID: \"af994010-9d35-4fcd-b444-64acb6b65577\") " 
pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357388 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m4rk\" (UniqueName: \"kubernetes.io/projected/399188c6-e2ce-4c19-93ac-aec1a685d28c-kube-api-access-2m4rk\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357404 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bx6\" (UniqueName: \"kubernetes.io/projected/21e03a5b-f9f2-4e57-90d3-03edcbb7e2db-kube-api-access-w5bx6\") pod \"swift-operator-controller-manager-6859f9b676-slwnb\" (UID: \"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.357437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndzpc\" (UniqueName: \"kubernetes.io/projected/b2ac5ce4-5123-4d82-aca2-a776d4f89f09-kube-api-access-ndzpc\") pod \"placement-operator-controller-manager-54689d9f88-4dfwm\" (UID: \"b2ac5ce4-5123-4d82-aca2-a776d4f89f09\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.357873 4717 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.357914 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert podName:399188c6-e2ce-4c19-93ac-aec1a685d28c nodeName:}" failed. No retries permitted until 2025-10-07 14:09:10.85790061 +0000 UTC m=+932.685826402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert") pod "openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" (UID: "399188c6-e2ce-4c19-93ac-aec1a685d28c") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.372112 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.373279 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.384382 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m4rk\" (UniqueName: \"kubernetes.io/projected/399188c6-e2ce-4c19-93ac-aec1a685d28c-kube-api-access-2m4rk\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.389350 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-8lhpq" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.392810 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.398177 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6rc8\" (UniqueName: \"kubernetes.io/projected/9af7f241-0d0a-457e-9332-51d88b1a52d1-kube-api-access-d6rc8\") pod \"ovn-operator-controller-manager-6d8b6f9b9-x5psc\" (UID: \"9af7f241-0d0a-457e-9332-51d88b1a52d1\") " pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.398989 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndzpc\" (UniqueName: \"kubernetes.io/projected/b2ac5ce4-5123-4d82-aca2-a776d4f89f09-kube-api-access-ndzpc\") pod \"placement-operator-controller-manager-54689d9f88-4dfwm\" (UID: \"b2ac5ce4-5123-4d82-aca2-a776d4f89f09\") " pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.402765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz9cl\" (UniqueName: \"kubernetes.io/projected/af994010-9d35-4fcd-b444-64acb6b65577-kube-api-access-sz9cl\") pod \"octavia-operator-controller-manager-7468f855d8-vlk87\" (UID: \"af994010-9d35-4fcd-b444-64acb6b65577\") " pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.419893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bx6\" (UniqueName: \"kubernetes.io/projected/21e03a5b-f9f2-4e57-90d3-03edcbb7e2db-kube-api-access-w5bx6\") pod \"swift-operator-controller-manager-6859f9b676-slwnb\" (UID: \"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db\") " pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.459000 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6x6z\" (UniqueName: \"kubernetes.io/projected/c046dbff-c7b3-464b-97c7-ae47f24bcd61-kube-api-access-r6x6z\") pod \"telemetry-operator-controller-manager-5d4d74dd89-bxcrn\" (UID: \"c046dbff-c7b3-464b-97c7-ae47f24bcd61\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.459086 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: 
\"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.459139 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jfsc\" (UniqueName: \"kubernetes.io/projected/1b5c23ef-d224-468b-bc22-2d98de7c4132-kube-api-access-8jfsc\") pod \"test-operator-controller-manager-5cd5cb47d7-xsmnx\" (UID: \"1b5c23ef-d224-468b-bc22-2d98de7c4132\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.459518 4717 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.459560 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert podName:47df3c50-6937-4479-8463-4d6816e354d4 nodeName:}" failed. No retries permitted until 2025-10-07 14:09:11.459547369 +0000 UTC m=+933.287473151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert") pod "infra-operator-controller-manager-658588b8c9-b4l9q" (UID: "47df3c50-6937-4479-8463-4d6816e354d4") : secret "infra-operator-webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.459847 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-574b968964-27nb9"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.461296 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.465695 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.465950 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-k42cl" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.478128 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-574b968964-27nb9"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.479612 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6x6z\" (UniqueName: \"kubernetes.io/projected/c046dbff-c7b3-464b-97c7-ae47f24bcd61-kube-api-access-r6x6z\") pod \"telemetry-operator-controller-manager-5d4d74dd89-bxcrn\" (UID: \"c046dbff-c7b3-464b-97c7-ae47f24bcd61\") " pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.486296 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.487167 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.492903 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2"] Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.496573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-ntkhj" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.497718 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.520334 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.560133 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnbmt\" (UniqueName: \"kubernetes.io/projected/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-kube-api-access-bnbmt\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.560218 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm4x\" (UniqueName: \"kubernetes.io/projected/9f385c04-ccd4-4526-ba50-53c7c637a0d3-kube-api-access-grm4x\") pod \"watcher-operator-controller-manager-6cbc6dd547-gtf2z\" (UID: \"9f385c04-ccd4-4526-ba50-53c7c637a0d3\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.560278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.560307 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jfsc\" (UniqueName: \"kubernetes.io/projected/1b5c23ef-d224-468b-bc22-2d98de7c4132-kube-api-access-8jfsc\") pod \"test-operator-controller-manager-5cd5cb47d7-xsmnx\" (UID: \"1b5c23ef-d224-468b-bc22-2d98de7c4132\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.576391 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.601899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jfsc\" (UniqueName: \"kubernetes.io/projected/1b5c23ef-d224-468b-bc22-2d98de7c4132-kube-api-access-8jfsc\") pod \"test-operator-controller-manager-5cd5cb47d7-xsmnx\" (UID: \"1b5c23ef-d224-468b-bc22-2d98de7c4132\") " pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.641256 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.662977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.663120 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnbmt\" (UniqueName: \"kubernetes.io/projected/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-kube-api-access-bnbmt\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.663177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d62zz\" (UniqueName: \"kubernetes.io/projected/6c49d800-49ef-4cb3-894a-632b519b22a8-kube-api-access-d62zz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-z65d2\" (UID: \"6c49d800-49ef-4cb3-894a-632b519b22a8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.663199 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm4x\" (UniqueName: \"kubernetes.io/projected/9f385c04-ccd4-4526-ba50-53c7c637a0d3-kube-api-access-grm4x\") pod \"watcher-operator-controller-manager-6cbc6dd547-gtf2z\" (UID: \"9f385c04-ccd4-4526-ba50-53c7c637a0d3\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.663662 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: E1007 14:09:10.663699 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert podName:a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19 nodeName:}" failed. No retries permitted until 2025-10-07 14:09:11.163685121 +0000 UTC m=+932.991610913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert") pod "openstack-operator-controller-manager-574b968964-27nb9" (UID: "a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19") : secret "webhook-server-cert" not found Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.674748 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.686594 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.697201 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm4x\" (UniqueName: \"kubernetes.io/projected/9f385c04-ccd4-4526-ba50-53c7c637a0d3-kube-api-access-grm4x\") pod \"watcher-operator-controller-manager-6cbc6dd547-gtf2z\" (UID: \"9f385c04-ccd4-4526-ba50-53c7c637a0d3\") " pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.704259 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnbmt\" (UniqueName: \"kubernetes.io/projected/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-kube-api-access-bnbmt\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.708531 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.764827 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d62zz\" (UniqueName: \"kubernetes.io/projected/6c49d800-49ef-4cb3-894a-632b519b22a8-kube-api-access-d62zz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-z65d2\" (UID: \"6c49d800-49ef-4cb3-894a-632b519b22a8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.782357 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d62zz\" (UniqueName: \"kubernetes.io/projected/6c49d800-49ef-4cb3-894a-632b519b22a8-kube-api-access-d62zz\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-z65d2\" (UID: \"6c49d800-49ef-4cb3-894a-632b519b22a8\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.866756 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.878865 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/399188c6-e2ce-4c19-93ac-aec1a685d28c-cert\") pod \"openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg\" (UID: \"399188c6-e2ce-4c19-93ac-aec1a685d28c\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:10 crc kubenswrapper[4717]: I1007 14:09:10.980913 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.007597 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.016107 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.049617 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.056761 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.108827 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.173737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.173956 4717 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.174024 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert podName:a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19 nodeName:}" failed. No retries permitted until 2025-10-07 14:09:12.173992524 +0000 UTC m=+934.001918316 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert") pod "openstack-operator-controller-manager-574b968964-27nb9" (UID: "a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19") : secret "webhook-server-cert" not found Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.207122 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" event={"ID":"303da62c-428b-4494-9ad6-4168652dcd4e","Type":"ContainerStarted","Data":"19203c49fbc011f1e0dbe430b083e65973458a7d14587aef3448cccb2cf7d2f3"} Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.207787 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" event={"ID":"c56f0658-961c-4727-8ba8-5d24af8523dd","Type":"ContainerStarted","Data":"31c3f253bf427471d1a4144222f81b9cbd28899a2a38aafd07c67b5bcecfd56e"} Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.208551 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" event={"ID":"ee9e55bf-132a-44ef-82ca-0e1ac422afd3","Type":"ContainerStarted","Data":"c213406892ee3cf24a8e5e2bf33f9f1743d58caddfdb2cf818364c3c77237e9a"} Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.338292 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa98a33_f3d0_49ee_8909_a88e320aa26c.slice/crio-85f655d329b32cd3df94f8e2f0ef8ecf724d5a8a0f4798b60d376d05281b4c1b WatchSource:0}: Error finding container 85f655d329b32cd3df94f8e2f0ef8ecf724d5a8a0f4798b60d376d05281b4c1b: Status 404 returned error can't find the container with id 85f655d329b32cd3df94f8e2f0ef8ecf724d5a8a0f4798b60d376d05281b4c1b Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.343118 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx"] Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.343346 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd500c0_e80e_4553_affd_c3b5437d67b7.slice/crio-9e14c04093d7a75098ca8f6c918149564e9363c58b9d0962f5fc18d8be4c064c WatchSource:0}: Error finding container 9e14c04093d7a75098ca8f6c918149564e9363c58b9d0962f5fc18d8be4c064c: Status 404 returned error can't find the container with id 9e14c04093d7a75098ca8f6c918149564e9363c58b9d0962f5fc18d8be4c064c Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.348083 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.446949 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx"] Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.450830 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4fabdda_208b_48e9_b4d9_85e638c74ad4.slice/crio-99466a667bfc290fdfed5d9ff310092de28ecd9d65afee27ec33f964e0c47d67 WatchSource:0}: Error finding container 99466a667bfc290fdfed5d9ff310092de28ecd9d65afee27ec33f964e0c47d67: Status 404 returned error can't find the container with id 99466a667bfc290fdfed5d9ff310092de28ecd9d65afee27ec33f964e0c47d67 Oct 07 14:09:11 crc 
kubenswrapper[4717]: I1007 14:09:11.478350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.483314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/47df3c50-6937-4479-8463-4d6816e354d4-cert\") pod \"infra-operator-controller-manager-658588b8c9-b4l9q\" (UID: \"47df3c50-6937-4479-8463-4d6816e354d4\") " pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.535541 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.633876 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.649317 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.677242 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.711176 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.715065 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.720313 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.724666 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.726447 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.730302 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.733700 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.737625 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.740932 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd"] Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.746247 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0d37d45_9608_4b1d_97b0_62f8b36ab834.slice/crio-463be57810db0b7fe9dc7267f809b52d0680f9e3c502f85aabcd3387bf60140e WatchSource:0}: Error finding container 463be57810db0b7fe9dc7267f809b52d0680f9e3c502f85aabcd3387bf60140e: Status 404 returned error can't find the container with id 463be57810db0b7fe9dc7267f809b52d0680f9e3c502f85aabcd3387bf60140e Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.751121 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6x6z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5d4d74dd89-bxcrn_openstack-operators(c046dbff-c7b3-464b-97c7-ae47f24bcd61): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.753167 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e03a5b_f9f2_4e57_90d3_03edcbb7e2db.slice/crio-9acb3b5d9821ffdc756ecc55032368caa9bce58a1fe6d15519d3f108371ef69b WatchSource:0}: Error finding container 9acb3b5d9821ffdc756ecc55032368caa9bce58a1fe6d15519d3f108371ef69b: Status 404 returned error can't find the container with id 9acb3b5d9821ffdc756ecc55032368caa9bce58a1fe6d15519d3f108371ef69b Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.753283 4717 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sr8j9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-8d984cc4d-f7qpd_openstack-operators(1edc1259-6db8-4b89-86fa-6410a3d5931d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.753871 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f385c04_ccd4_4526_ba50_53c7c637a0d3.slice/crio-392acac494309d2724fa5bbe6d2f80b6ea8c208426a296bdefcb40f8654f0e94 WatchSource:0}: Error finding container 392acac494309d2724fa5bbe6d2f80b6ea8c208426a296bdefcb40f8654f0e94: Status 404 returned error can't find the container with id 392acac494309d2724fa5bbe6d2f80b6ea8c208426a296bdefcb40f8654f0e94 Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.754347 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w5bx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-6859f9b676-slwnb_openstack-operators(21e03a5b-f9f2-4e57-90d3-03edcbb7e2db): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.757440 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-grm4x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6cbc6dd547-gtf2z_openstack-operators(9f385c04-ccd4-4526-ba50-53c7c637a0d3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.786391 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjcbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-6cd6d7bdf5-znbv7_openstack-operators(d0d37d45-9608-4b1d-97b0-62f8b36ab834): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.893272 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2"] Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.897263 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx"] Oct 07 14:09:11 crc kubenswrapper[4717]: W1007 14:09:11.914936 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5c23ef_d224_468b_bc22_2d98de7c4132.slice/crio-9985d60bea288e15ad5706f7c6124a9c21b217477dd2df150bd7d06880e99e38 WatchSource:0}: Error finding container 9985d60bea288e15ad5706f7c6124a9c21b217477dd2df150bd7d06880e99e38: Status 404 returned error can't find the container with id 9985d60bea288e15ad5706f7c6124a9c21b217477dd2df150bd7d06880e99e38 Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.918780 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg"] Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.919452 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8jfsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5cd5cb47d7-xsmnx_openstack-operators(1b5c23ef-d224-468b-bc22-2d98de7c4132): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: E1007 14:09:11.941062 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9
/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay
.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-sha
re:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-pl
acement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2m4rk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg_openstack-operators(399188c6-e2ce-4c19-93ac-aec1a685d28c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:11 crc kubenswrapper[4717]: I1007 14:09:11.996798 4717 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q"] Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.027599 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6jd2j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-658588b8c9-b4l9q_openstack-operators(47df3c50-6937-4479-8463-4d6816e354d4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.141338 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" podUID="1edc1259-6db8-4b89-86fa-6410a3d5931d" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.141484 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" podUID="9f385c04-ccd4-4526-ba50-53c7c637a0d3" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.142227 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" podUID="c046dbff-c7b3-464b-97c7-ae47f24bcd61" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.162311 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" podUID="d0d37d45-9608-4b1d-97b0-62f8b36ab834" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.163471 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" podUID="21e03a5b-f9f2-4e57-90d3-03edcbb7e2db" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.166166 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" podUID="399188c6-e2ce-4c19-93ac-aec1a685d28c" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.166952 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" podUID="1b5c23ef-d224-468b-bc22-2d98de7c4132" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.190073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.196777 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19-cert\") pod \"openstack-operator-controller-manager-574b968964-27nb9\" (UID: \"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19\") " pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.222436 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" podUID="47df3c50-6937-4479-8463-4d6816e354d4" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.222695 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.228087 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" event={"ID":"d7327559-8f4c-4745-acde-51a8ec9ca67a","Type":"ContainerStarted","Data":"47b68f6be66f8bbb7fc944cdfd71d8d16b43de732d08986a81b483ea3e2125cb"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.231863 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" event={"ID":"9f385c04-ccd4-4526-ba50-53c7c637a0d3","Type":"ContainerStarted","Data":"3527f0204fbabad212c02475c04aa6673d41dbf63b00121af00f0000b423655a"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.231900 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" event={"ID":"9f385c04-ccd4-4526-ba50-53c7c637a0d3","Type":"ContainerStarted","Data":"392acac494309d2724fa5bbe6d2f80b6ea8c208426a296bdefcb40f8654f0e94"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.233737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" event={"ID":"1b5c23ef-d224-468b-bc22-2d98de7c4132","Type":"ContainerStarted","Data":"aa4fa34cc3eeb71af8a3e8c83cff086e6098982f98f73b042ab6bbe35b42985f"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.233774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" event={"ID":"1b5c23ef-d224-468b-bc22-2d98de7c4132","Type":"ContainerStarted","Data":"9985d60bea288e15ad5706f7c6124a9c21b217477dd2df150bd7d06880e99e38"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.235502 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" event={"ID":"1edc1259-6db8-4b89-86fa-6410a3d5931d","Type":"ContainerStarted","Data":"d6eb5dd5a6b83171c677ec66652ca726875fe2ee300e809cdd83b9f46d59139a"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.235528 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" event={"ID":"1edc1259-6db8-4b89-86fa-6410a3d5931d","Type":"ContainerStarted","Data":"5fa0343cb2b91e7744b65625f30bddd1eda352c4be66ee70ec51e4ded67313e0"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.248213 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" podUID="1b5c23ef-d224-468b-bc22-2d98de7c4132" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.248390 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" podUID="1edc1259-6db8-4b89-86fa-6410a3d5931d" Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.248432 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" podUID="9f385c04-ccd4-4526-ba50-53c7c637a0d3" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.250240 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" event={"ID":"9af7f241-0d0a-457e-9332-51d88b1a52d1","Type":"ContainerStarted","Data":"cfe711c9f09cc25b2f4f901b4074ba26948b46eed7855b1554bda280f44f2318"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.270448 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" event={"ID":"b2ac5ce4-5123-4d82-aca2-a776d4f89f09","Type":"ContainerStarted","Data":"7e66abf3565c81d7dada8d980b7764b2b0789a641580c95b3b357429688c31f4"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.303479 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" event={"ID":"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db","Type":"ContainerStarted","Data":"de734cdfc6393db2a75b5bffa5ad822338e4866152202ba309cc2223c3b41607"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.303517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" event={"ID":"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db","Type":"ContainerStarted","Data":"9acb3b5d9821ffdc756ecc55032368caa9bce58a1fe6d15519d3f108371ef69b"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.309390 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" podUID="21e03a5b-f9f2-4e57-90d3-03edcbb7e2db" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.315315 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" event={"ID":"47df3c50-6937-4479-8463-4d6816e354d4","Type":"ContainerStarted","Data":"2a44197dbb98e25c80dd1dc2118878aa9197cc14e3a3faf66451ca0f802674b4"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.330353 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" podUID="47df3c50-6937-4479-8463-4d6816e354d4" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.340969 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" event={"ID":"acb68432-c913-4c77-bfc8-ef15f9e74a1c","Type":"ContainerStarted","Data":"e8b27d09e51a0d33062a722ed16a5ecd98a4ca9e4cc83d0c78b41d5a281e6a48"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.365186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" event={"ID":"d0d37d45-9608-4b1d-97b0-62f8b36ab834","Type":"ContainerStarted","Data":"3ccddc5194fc21ba2351ede1c13f46790fa1b8a9b01c9549afe65e6e2984f426"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.365534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" event={"ID":"d0d37d45-9608-4b1d-97b0-62f8b36ab834","Type":"ContainerStarted","Data":"463be57810db0b7fe9dc7267f809b52d0680f9e3c502f85aabcd3387bf60140e"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.376251 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" podUID="d0d37d45-9608-4b1d-97b0-62f8b36ab834" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.399131 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" event={"ID":"0aa98a33-f3d0-49ee-8909-a88e320aa26c","Type":"ContainerStarted","Data":"85f655d329b32cd3df94f8e2f0ef8ecf724d5a8a0f4798b60d376d05281b4c1b"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.400214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" event={"ID":"763bbe43-a00e-4a82-b90c-bd56ab6a516a","Type":"ContainerStarted","Data":"99f960d82b9402848f293fc6455a02695b24d6575dde388fb49fac4fb454c138"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.401801 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" event={"ID":"af994010-9d35-4fcd-b444-64acb6b65577","Type":"ContainerStarted","Data":"1b422f5676fe33682d9f4157d684c9e9d7a1c2e2b9b3f183917736c3a609f994"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.404936 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" event={"ID":"399188c6-e2ce-4c19-93ac-aec1a685d28c","Type":"ContainerStarted","Data":"c07ccb333f26a2b0f4f93ddc5afbd987c5acb5f16063206ab2c2615931649cc2"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.404983 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" event={"ID":"399188c6-e2ce-4c19-93ac-aec1a685d28c","Type":"ContainerStarted","Data":"b7a6119211f21d5a94a4c1bdc591238e58821d86364095c71ed1504c4325dbf3"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.406380 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" podUID="399188c6-e2ce-4c19-93ac-aec1a685d28c" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.408949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" 
event={"ID":"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f","Type":"ContainerStarted","Data":"d5a890c619dc57af647913f2bdedabfbe9b43b0187ed26d6fafc4e60e693d97b"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.428387 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" event={"ID":"6c49d800-49ef-4cb3-894a-632b519b22a8","Type":"ContainerStarted","Data":"fbb2b350ccc0ab8155e1bfc9972e68339bb9901ee594e497b2cce204f1a77595"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.430960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" event={"ID":"c046dbff-c7b3-464b-97c7-ae47f24bcd61","Type":"ContainerStarted","Data":"dc7f656a6db0c7b3bdad42d75d15429be5f072bb9bcd776ab50339ec847e62cd"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.430990 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" event={"ID":"c046dbff-c7b3-464b-97c7-ae47f24bcd61","Type":"ContainerStarted","Data":"4ade84c02bf2dde73998175746568f21d65e792d6f155299ed4c0714dad66e6c"} Oct 07 14:09:12 crc kubenswrapper[4717]: E1007 14:09:12.432646 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" podUID="c046dbff-c7b3-464b-97c7-ae47f24bcd61" Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.435532 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" event={"ID":"cfd500c0-e80e-4553-affd-c3b5437d67b7","Type":"ContainerStarted","Data":"9e14c04093d7a75098ca8f6c918149564e9363c58b9d0962f5fc18d8be4c064c"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.447702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" event={"ID":"b4fabdda-208b-48e9-b4d9-85e638c74ad4","Type":"ContainerStarted","Data":"99466a667bfc290fdfed5d9ff310092de28ecd9d65afee27ec33f964e0c47d67"} Oct 07 14:09:12 crc kubenswrapper[4717]: I1007 14:09:12.829663 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-574b968964-27nb9"] Oct 07 14:09:12 crc kubenswrapper[4717]: W1007 14:09:12.867808 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda300e117_c1c8_4b5f_a37a_f7ce0e2b4f19.slice/crio-c47ff8e71e0c4524aec865dbbd8594b55cd6092a9e6b9b2fb5cea65f521e7ac9 WatchSource:0}: Error finding container c47ff8e71e0c4524aec865dbbd8594b55cd6092a9e6b9b2fb5cea65f521e7ac9: Status 404 returned error can't find the container with id c47ff8e71e0c4524aec865dbbd8594b55cd6092a9e6b9b2fb5cea65f521e7ac9 Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.471299 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" event={"ID":"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19","Type":"ContainerStarted","Data":"2dea76ee6985fa447144b9f7c5cdd57e4882c720346e488f75f78e7e7af32d71"} Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.471604 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" event={"ID":"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19","Type":"ContainerStarted","Data":"9f3aa5924f02e279523dcea289f50947cccd10633203568953e3f2b021bb1de9"} Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.471621 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.471632 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" event={"ID":"a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19","Type":"ContainerStarted","Data":"c47ff8e71e0c4524aec865dbbd8594b55cd6092a9e6b9b2fb5cea65f521e7ac9"} Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.480465 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" event={"ID":"47df3c50-6937-4479-8463-4d6816e354d4","Type":"ContainerStarted","Data":"3b32b840e6f9697a7916b990a888e6cb16a1a1f8ad65ca6a76c42d8f7b2ccfe8"} Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.484978 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:dfd044635f9df9ed1d249387fa622177db35cdc72475e1c570617b8d17c64862\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" podUID="1edc1259-6db8-4b89-86fa-6410a3d5931d" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485072 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" podUID="47df3c50-6937-4479-8463-4d6816e354d4" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485136 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:637bb7b9ac308bc1e323391a3593b824f688090a856c83385814c17a571b1eed\\\"\"" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" podUID="21e03a5b-f9f2-4e57-90d3-03edcbb7e2db" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485134 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:0daf76cc40ab619ae266b11defcc1b65beb22d859369e7b1b04de9169089a4cb\\\"\"" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" podUID="1b5c23ef-d224-468b-bc22-2d98de7c4132" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485135 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:bf55026ba10b80e1e24733078bd204cef8766d21a305fd000707a1e3b30ff52e\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" podUID="c046dbff-c7b3-464b-97c7-ae47f24bcd61" Oct 07 14:09:13 
crc kubenswrapper[4717]: E1007 14:09:13.485173 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:64f57b2b59dea2bd9fae91490c5bec2687131884a049e6579819d9f951b877c6\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" podUID="9f385c04-ccd4-4526-ba50-53c7c637a0d3" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485175 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:bcd1acac74e68eea5a9c3b7ba1bcb29d3a5b43423fc23c19ad4715bdac41f799\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" podUID="399188c6-e2ce-4c19-93ac-aec1a685d28c" Oct 07 14:09:13 crc kubenswrapper[4717]: E1007 14:09:13.485216 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5f96b563a63494082323bfced089d6589e0c89db43c6a39a2e912c79b1a278fe\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" podUID="d0d37d45-9608-4b1d-97b0-62f8b36ab834" Oct 07 14:09:13 crc kubenswrapper[4717]: I1007 14:09:13.526420 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" podStartSLOduration=3.5263992760000002 podStartE2EDuration="3.526399276s" podCreationTimestamp="2025-10-07 14:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:09:13.522491197 +0000 UTC m=+935.350416989" watchObservedRunningTime="2025-10-07 14:09:13.526399276 +0000 UTC m=+935.354325068" Oct 07 14:09:14 crc kubenswrapper[4717]: E1007 14:09:14.488832 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:b6cef68bfaacdf992a9fa1a6b03a848a48c18cbb6ed12d95561b4b37d858b99f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" podUID="47df3c50-6937-4479-8463-4d6816e354d4" Oct 07 14:09:22 crc kubenswrapper[4717]: I1007 14:09:22.228946 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-574b968964-27nb9" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.549034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" event={"ID":"ee9e55bf-132a-44ef-82ca-0e1ac422afd3","Type":"ContainerStarted","Data":"5f715030eafe1688a7b6433dbc2a80cdeccd5e00f113cd188f5330fce841d701"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.549335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" event={"ID":"ee9e55bf-132a-44ef-82ca-0e1ac422afd3","Type":"ContainerStarted","Data":"9374b04419cde9ecfe0ca04502d1d48d3009ee3dcd9211c7684ddda44ade994a"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.549349 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.574680 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" event={"ID":"cfd500c0-e80e-4553-affd-c3b5437d67b7","Type":"ContainerStarted","Data":"6974d78c82d08bb2d93b489b5c27ba266c62e2f8a5c040b92790c14c6cfb9629"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.574774 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.574790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" event={"ID":"cfd500c0-e80e-4553-affd-c3b5437d67b7","Type":"ContainerStarted","Data":"18426c341c4386a773248fb22ca63594b9671b305419ef23c0cf62190e21746d"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.583884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" event={"ID":"d7327559-8f4c-4745-acde-51a8ec9ca67a","Type":"ContainerStarted","Data":"5d4f86915bbe0e7bfc2484a320814476384b42d073bdb5956c38cc588ee65457"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.587389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" event={"ID":"b4fabdda-208b-48e9-b4d9-85e638c74ad4","Type":"ContainerStarted","Data":"da3c6f066db7393a1c1e4cb6c2d0b5835fe79ad7a5211b6ce02b4d350f5a72ef"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.587449 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" event={"ID":"b4fabdda-208b-48e9-b4d9-85e638c74ad4","Type":"ContainerStarted","Data":"fe8db3f4eff1ed114a3ae97d81b2727d230feeaa5615511303d0b1c8331d7c67"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.587581 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.597477 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" podStartSLOduration=3.37820095 podStartE2EDuration="14.597459104s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.109206451 +0000 UTC m=+932.937132243" lastFinishedPulling="2025-10-07 14:09:22.328464595 +0000 UTC m=+944.156390397" observedRunningTime="2025-10-07 14:09:23.597427983 +0000 UTC m=+945.425353785" watchObservedRunningTime="2025-10-07 14:09:23.597459104 +0000 UTC m=+945.425384896" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.599960 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" event={"ID":"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f","Type":"ContainerStarted","Data":"e306aec484da5d9d8eac606cc0856485fb0927fa57c9e7a23e7039a90c8af977"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.612417 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" 
event={"ID":"acb68432-c913-4c77-bfc8-ef15f9e74a1c","Type":"ContainerStarted","Data":"313222ef17ecfec917e522525f20dd4054a3762923eda99e4a609376b20d0978"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.629460 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" event={"ID":"af994010-9d35-4fcd-b444-64acb6b65577","Type":"ContainerStarted","Data":"3abb56d656054c5b742bb569666e883356ecf9cbf8cf0f0af42f862a00681976"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.636032 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" podStartSLOduration=3.767011831 podStartE2EDuration="14.635998967s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.45268473 +0000 UTC m=+933.280610522" lastFinishedPulling="2025-10-07 14:09:22.321671866 +0000 UTC m=+944.149597658" observedRunningTime="2025-10-07 14:09:23.629504586 +0000 UTC m=+945.457430378" watchObservedRunningTime="2025-10-07 14:09:23.635998967 +0000 UTC m=+945.463924759" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.638774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" event={"ID":"9af7f241-0d0a-457e-9332-51d88b1a52d1","Type":"ContainerStarted","Data":"b12d564846a713d8dc6d16217db81492ddfa8684a6b1b185029e72fee2ed4453"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.645857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" event={"ID":"0aa98a33-f3d0-49ee-8909-a88e320aa26c","Type":"ContainerStarted","Data":"9baf717dd166cf4cbb3556002e93e1cfb5e6115b9041ebce184c83020df513f4"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.645901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" event={"ID":"0aa98a33-f3d0-49ee-8909-a88e320aa26c","Type":"ContainerStarted","Data":"3472ed98974f5c973354e0d1f5c9922cd4789f7ab86ee47b19c00b638ac23d72"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.646571 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.648115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" event={"ID":"b2ac5ce4-5123-4d82-aca2-a776d4f89f09","Type":"ContainerStarted","Data":"2ed45617c28ab96a74ce01b0445eeb316eb5ba0c2913a08b9c04db62c1ca38e8"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.650637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" event={"ID":"763bbe43-a00e-4a82-b90c-bd56ab6a516a","Type":"ContainerStarted","Data":"6e57f994d490642b2a52671275fca2334f21c4ab293a5e51da3c785bcab594e8"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.657486 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" podStartSLOduration=3.619737162 podStartE2EDuration="14.657470284s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.346823184 +0000 UTC m=+933.174748966" lastFinishedPulling="2025-10-07 
14:09:22.384556296 +0000 UTC m=+944.212482088" observedRunningTime="2025-10-07 14:09:23.650824129 +0000 UTC m=+945.478749921" watchObservedRunningTime="2025-10-07 14:09:23.657470284 +0000 UTC m=+945.485396076" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.661209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" event={"ID":"303da62c-428b-4494-9ad6-4168652dcd4e","Type":"ContainerStarted","Data":"2765794563eeb0caab08587981ade50a4e4adf3872d4240b312586b0f31470e9"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.687427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" event={"ID":"c56f0658-961c-4727-8ba8-5d24af8523dd","Type":"ContainerStarted","Data":"75b508068636fe3643c0c065b08bcaf67ac5ff96773626c1eb44d748c4da8bb4"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.687478 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" event={"ID":"c56f0658-961c-4727-8ba8-5d24af8523dd","Type":"ContainerStarted","Data":"558df2894351611e990418fb887cf3c55857085e8c45ecd21982c76aaa7e9808"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.688181 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.692601 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" podStartSLOduration=3.749917685 podStartE2EDuration="14.692582872s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.340362974 +0000 UTC m=+933.168288766" lastFinishedPulling="2025-10-07 14:09:22.283028161 +0000 UTC m=+944.110953953" observedRunningTime="2025-10-07 14:09:23.689933748 +0000 UTC m=+945.517859540" watchObservedRunningTime="2025-10-07 14:09:23.692582872 +0000 UTC m=+945.520508664" Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.704612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" event={"ID":"6c49d800-49ef-4cb3-894a-632b519b22a8","Type":"ContainerStarted","Data":"4d790b29ef38ad1ffc866acc945e318a3ad33307b3ae8b4077c1b02bba3c31be"} Oct 07 14:09:23 crc kubenswrapper[4717]: I1007 14:09:23.721573 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" podStartSLOduration=3.5623398550000003 podStartE2EDuration="14.721554158s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.108554033 +0000 UTC m=+932.936479825" lastFinishedPulling="2025-10-07 14:09:22.267768336 +0000 UTC m=+944.095694128" observedRunningTime="2025-10-07 14:09:23.721243989 +0000 UTC m=+945.549169781" watchObservedRunningTime="2025-10-07 14:09:23.721554158 +0000 UTC m=+945.549479950" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.712097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" event={"ID":"d7327559-8f4c-4745-acde-51a8ec9ca67a","Type":"ContainerStarted","Data":"5a35246d298a1fb87ab2992450cc076850eb855879ad2fb65e09226e5c06da8e"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.712387 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.713564 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" event={"ID":"763bbe43-a00e-4a82-b90c-bd56ab6a516a","Type":"ContainerStarted","Data":"6e76b8b6e08c189ca0f30382843b842c21c18b40db353c11e329285faa060748"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.713615 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.715254 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" event={"ID":"af994010-9d35-4fcd-b444-64acb6b65577","Type":"ContainerStarted","Data":"80df6284dc259fef4c0b5ff5fa54f7c7977485b5d71154f5748253584ca9e9e6"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.715358 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.716666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" event={"ID":"9af7f241-0d0a-457e-9332-51d88b1a52d1","Type":"ContainerStarted","Data":"e92e9b6110324711366934b231ad184608ed4c5d2c982f616eb78c24b9f1923a"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.717053 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.718683 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" event={"ID":"dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f","Type":"ContainerStarted","Data":"37c2351f046706e80949c125aca99e2bd0cad56043519bdbdf643106d097c5a2"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.718753 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.720105 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" event={"ID":"acb68432-c913-4c77-bfc8-ef15f9e74a1c","Type":"ContainerStarted","Data":"f993a19a127aa9507fdf16d5dae468778600543887ba376e482032b2449e394d"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.720194 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.721612 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" event={"ID":"303da62c-428b-4494-9ad6-4168652dcd4e","Type":"ContainerStarted","Data":"0c95bef46838de1f81d7420219faf8fc40833be29b82103d27ce30c5e573645b"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.721660 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.725082 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" event={"ID":"b2ac5ce4-5123-4d82-aca2-a776d4f89f09","Type":"ContainerStarted","Data":"84e846767a0f1a7eba6cfc4ec9214cfdb2da72a14f914a3b38d4c7669fde9003"} Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.726482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.730140 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-z65d2" podStartSLOduration=4.35221477 podStartE2EDuration="14.730123488s" podCreationTimestamp="2025-10-07 14:09:10 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.910478102 +0000 UTC m=+933.738403894" lastFinishedPulling="2025-10-07 14:09:22.28838681 +0000 UTC m=+944.116312612" observedRunningTime="2025-10-07 14:09:23.744348552 +0000 UTC m=+945.572274344" watchObservedRunningTime="2025-10-07 14:09:24.730123488 +0000 UTC m=+946.558049280" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.733696 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" podStartSLOduration=5.112875619 podStartE2EDuration="15.733683867s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.705271331 +0000 UTC m=+933.533197123" lastFinishedPulling="2025-10-07 14:09:22.326079559 +0000 UTC m=+944.154005371" observedRunningTime="2025-10-07 14:09:24.728989476 +0000 UTC m=+946.556915268" watchObservedRunningTime="2025-10-07 14:09:24.733683867 +0000 UTC m=+946.561609659" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.751683 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" podStartSLOduration=5.162468059 podStartE2EDuration="15.751663427s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.738560337 +0000 UTC m=+933.566486129" lastFinishedPulling="2025-10-07 14:09:22.327755695 +0000 UTC m=+944.155681497" observedRunningTime="2025-10-07 14:09:24.742263846 +0000 UTC m=+946.570189638" watchObservedRunningTime="2025-10-07 14:09:24.751663427 +0000 UTC m=+946.579589219" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.782279 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" podStartSLOduration=5.245347866 podStartE2EDuration="15.782258029s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.73074636 +0000 UTC m=+933.558672152" lastFinishedPulling="2025-10-07 14:09:22.267656533 +0000 UTC m=+944.095582315" observedRunningTime="2025-10-07 14:09:24.764828304 +0000 UTC m=+946.592754096" watchObservedRunningTime="2025-10-07 14:09:24.782258029 +0000 UTC m=+946.610183821" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.787113 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" podStartSLOduration=4.567586912 podStartE2EDuration="15.787091283s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.109830898 +0000 UTC m=+932.937756690" 
lastFinishedPulling="2025-10-07 14:09:22.329335269 +0000 UTC m=+944.157261061" observedRunningTime="2025-10-07 14:09:24.780663304 +0000 UTC m=+946.608589096" watchObservedRunningTime="2025-10-07 14:09:24.787091283 +0000 UTC m=+946.615017075" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.798709 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" podStartSLOduration=5.13954601 podStartE2EDuration="15.798687746s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.669983868 +0000 UTC m=+933.497909660" lastFinishedPulling="2025-10-07 14:09:22.329125594 +0000 UTC m=+944.157051396" observedRunningTime="2025-10-07 14:09:24.796107914 +0000 UTC m=+946.624033716" watchObservedRunningTime="2025-10-07 14:09:24.798687746 +0000 UTC m=+946.626613578" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.812597 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" podStartSLOduration=5.272139942 podStartE2EDuration="15.812577033s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.748468263 +0000 UTC m=+933.576394055" lastFinishedPulling="2025-10-07 14:09:22.288905354 +0000 UTC m=+944.116831146" observedRunningTime="2025-10-07 14:09:24.810894386 +0000 UTC m=+946.638820198" watchObservedRunningTime="2025-10-07 14:09:24.812577033 +0000 UTC m=+946.640502825" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.826447 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" podStartSLOduration=5.192171826 podStartE2EDuration="15.826430228s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.695727005 +0000 UTC m=+933.523652787" lastFinishedPulling="2025-10-07 14:09:22.329985397 +0000 UTC m=+944.157911189" observedRunningTime="2025-10-07 14:09:24.823711832 +0000 UTC m=+946.651637624" watchObservedRunningTime="2025-10-07 14:09:24.826430228 +0000 UTC m=+946.654356020" Oct 07 14:09:24 crc kubenswrapper[4717]: I1007 14:09:24.840795 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" podStartSLOduration=5.260380555 podStartE2EDuration="15.840777268s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.748790442 +0000 UTC m=+933.576716234" lastFinishedPulling="2025-10-07 14:09:22.329187165 +0000 UTC m=+944.157112947" observedRunningTime="2025-10-07 14:09:24.836223781 +0000 UTC m=+946.664149583" watchObservedRunningTime="2025-10-07 14:09:24.840777268 +0000 UTC m=+946.668703060" Oct 07 14:09:29 crc kubenswrapper[4717]: I1007 14:09:29.819222 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-58c4cd55f4-xd2vv" Oct 07 14:09:29 crc kubenswrapper[4717]: I1007 14:09:29.828268 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-7d4d4f8d-sd746" Oct 07 14:09:29 crc kubenswrapper[4717]: I1007 14:09:29.869990 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-75dfd9b554-pfxp7" Oct 07 14:09:29 crc kubenswrapper[4717]: 
I1007 14:09:29.873937 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5dc44df7d5-8xb4k" Oct 07 14:09:29 crc kubenswrapper[4717]: I1007 14:09:29.921622 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-54b4974c45-8zlpx" Oct 07 14:09:29 crc kubenswrapper[4717]: I1007 14:09:29.947267 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-76d5b87f47-q8qkj" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.067054 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-649675d675-jdbmx" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.197387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b5ccf6d9c-8xkj4" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.255901 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-65d89cfd9f-7stm5" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.325741 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7c7fc454ff-28ncp" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.501186 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-6d8b6f9b9-x5psc" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.579680 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-54689d9f88-4dfwm" Oct 07 14:09:30 crc kubenswrapper[4717]: I1007 14:09:30.693482 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7468f855d8-vlk87" Oct 07 14:09:31 crc kubenswrapper[4717]: I1007 14:09:31.609564 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:09:31 crc kubenswrapper[4717]: I1007 14:09:31.609616 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.854831 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" event={"ID":"9f385c04-ccd4-4526-ba50-53c7c637a0d3","Type":"ContainerStarted","Data":"c6ebf8b6287fc2982fdd55d734db2fd4e871cff3c2b05d07866446187b83bcc2"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.855659 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.856917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" event={"ID":"1b5c23ef-d224-468b-bc22-2d98de7c4132","Type":"ContainerStarted","Data":"073ee72f3af935185b91600bb5d498d0f1048050ebbef09c0bd4b96e63ef80db"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.857286 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.858797 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" event={"ID":"c046dbff-c7b3-464b-97c7-ae47f24bcd61","Type":"ContainerStarted","Data":"e0df84370a13acf957d742f07549b29afce3b4399fbe057fb8a962a98bf7763d"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.858963 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.860549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" event={"ID":"1edc1259-6db8-4b89-86fa-6410a3d5931d","Type":"ContainerStarted","Data":"b2c912871e59df8e1e8fe0ca9f382d7c1d8a1e530082cce47f789fd94b864647"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.860733 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.861937 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" event={"ID":"399188c6-e2ce-4c19-93ac-aec1a685d28c","Type":"ContainerStarted","Data":"707f4a2aba7198da771e6aefb0d7c5cbc92fa8f8008faf3d14cd475597838a99"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.862153 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.863609 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" event={"ID":"21e03a5b-f9f2-4e57-90d3-03edcbb7e2db","Type":"ContainerStarted","Data":"d30b5a9da8812e197e1a2fdbae66396a526b6bdebb555d65a4eb5cac4a4ce8a6"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.863942 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.865116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" event={"ID":"47df3c50-6937-4479-8463-4d6816e354d4","Type":"ContainerStarted","Data":"63c8371923010018682ce8ee1285684d49baf8abe3e15a3eea501c9ed7930cb8"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.865447 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.866357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" 
event={"ID":"d0d37d45-9608-4b1d-97b0-62f8b36ab834","Type":"ContainerStarted","Data":"a3bd597d606d1d1db77a38f19265e0bc1eddee1e15c7353f5cb16d771ae8ae60"} Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.866505 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.875951 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" podStartSLOduration=2.926775508 podStartE2EDuration="30.875930497s" podCreationTimestamp="2025-10-07 14:09:10 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.75733464 +0000 UTC m=+933.585260432" lastFinishedPulling="2025-10-07 14:09:39.706489629 +0000 UTC m=+961.534415421" observedRunningTime="2025-10-07 14:09:40.873479068 +0000 UTC m=+962.701404860" watchObservedRunningTime="2025-10-07 14:09:40.875930497 +0000 UTC m=+962.703856289" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.916305 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" podStartSLOduration=3.9665418040000002 podStartE2EDuration="31.91628999s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.753214495 +0000 UTC m=+933.581140287" lastFinishedPulling="2025-10-07 14:09:39.702962681 +0000 UTC m=+961.530888473" observedRunningTime="2025-10-07 14:09:40.914884741 +0000 UTC m=+962.742810533" watchObservedRunningTime="2025-10-07 14:09:40.91628999 +0000 UTC m=+962.744215782" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.918308 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" podStartSLOduration=4.152985943 podStartE2EDuration="31.918297926s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.940363693 +0000 UTC m=+933.768289485" lastFinishedPulling="2025-10-07 14:09:39.705675676 +0000 UTC m=+961.533601468" observedRunningTime="2025-10-07 14:09:40.903667719 +0000 UTC m=+962.731593511" watchObservedRunningTime="2025-10-07 14:09:40.918297926 +0000 UTC m=+962.746223718" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.932197 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" podStartSLOduration=3.988480065 podStartE2EDuration="31.932180042s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.750993703 +0000 UTC m=+933.578919495" lastFinishedPulling="2025-10-07 14:09:39.69469368 +0000 UTC m=+961.522619472" observedRunningTime="2025-10-07 14:09:40.927659496 +0000 UTC m=+962.755585308" watchObservedRunningTime="2025-10-07 14:09:40.932180042 +0000 UTC m=+962.760105834" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.949164 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" podStartSLOduration=4.269924588 podStartE2EDuration="31.949142704s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:12.027290513 +0000 UTC m=+933.855216305" lastFinishedPulling="2025-10-07 14:09:39.706508629 +0000 UTC m=+961.534434421" observedRunningTime="2025-10-07 14:09:40.945216885 +0000 
UTC m=+962.773142677" watchObservedRunningTime="2025-10-07 14:09:40.949142704 +0000 UTC m=+962.777068486" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.963533 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" podStartSLOduration=4.03073147 podStartE2EDuration="31.963516684s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.786201353 +0000 UTC m=+933.614127135" lastFinishedPulling="2025-10-07 14:09:39.718986557 +0000 UTC m=+961.546912349" observedRunningTime="2025-10-07 14:09:40.959642287 +0000 UTC m=+962.787568079" watchObservedRunningTime="2025-10-07 14:09:40.963516684 +0000 UTC m=+962.791442476" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.973949 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" podStartSLOduration=4.419405798 podStartE2EDuration="31.973925714s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.919317898 +0000 UTC m=+933.747243680" lastFinishedPulling="2025-10-07 14:09:39.473837804 +0000 UTC m=+961.301763596" observedRunningTime="2025-10-07 14:09:40.973360048 +0000 UTC m=+962.801285840" watchObservedRunningTime="2025-10-07 14:09:40.973925714 +0000 UTC m=+962.801851506" Oct 07 14:09:40 crc kubenswrapper[4717]: I1007 14:09:40.991639 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" podStartSLOduration=5.355280377 podStartE2EDuration="31.991619727s" podCreationTimestamp="2025-10-07 14:09:09 +0000 UTC" firstStartedPulling="2025-10-07 14:09:11.754209403 +0000 UTC m=+933.582135185" lastFinishedPulling="2025-10-07 14:09:38.390548743 +0000 UTC m=+960.218474535" observedRunningTime="2025-10-07 14:09:40.98565303 +0000 UTC m=+962.813578822" watchObservedRunningTime="2025-10-07 14:09:40.991619727 +0000 UTC m=+962.819545519" Oct 07 14:09:49 crc kubenswrapper[4717]: I1007 14:09:49.982271 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6cd6d7bdf5-znbv7" Oct 07 14:09:50 crc kubenswrapper[4717]: I1007 14:09:50.312895 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-8d984cc4d-f7qpd" Oct 07 14:09:50 crc kubenswrapper[4717]: I1007 14:09:50.524066 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6859f9b676-slwnb" Oct 07 14:09:50 crc kubenswrapper[4717]: I1007 14:09:50.644803 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5d4d74dd89-bxcrn" Oct 07 14:09:50 crc kubenswrapper[4717]: I1007 14:09:50.687892 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd5cb47d7-xsmnx" Oct 07 14:09:50 crc kubenswrapper[4717]: I1007 14:09:50.712833 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6cbc6dd547-gtf2z" Oct 07 14:09:51 crc kubenswrapper[4717]: I1007 14:09:51.057089 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg" Oct 07 14:09:51 crc kubenswrapper[4717]: I1007 14:09:51.541035 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-658588b8c9-b4l9q" Oct 07 14:10:01 crc kubenswrapper[4717]: I1007 14:10:01.610046 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:10:01 crc kubenswrapper[4717]: I1007 14:10:01.610545 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:10:01 crc kubenswrapper[4717]: I1007 14:10:01.610596 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:10:01 crc kubenswrapper[4717]: I1007 14:10:01.611208 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:10:01 crc kubenswrapper[4717]: I1007 14:10:01.611297 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc" gracePeriod=600 Oct 07 14:10:02 crc kubenswrapper[4717]: I1007 14:10:02.003469 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc" exitCode=0 Oct 07 14:10:02 crc kubenswrapper[4717]: I1007 14:10:02.003540 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc"} Oct 07 14:10:02 crc kubenswrapper[4717]: I1007 14:10:02.003887 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba"} Oct 07 14:10:02 crc kubenswrapper[4717]: I1007 14:10:02.003911 4717 scope.go:117] "RemoveContainer" containerID="7391daae6696cb6ca21332fa8ceb60a843dc3173f8674514bf587cc8767d21d3" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.112610 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.114635 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.122080 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-k6pj9" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.125058 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.140761 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.193388 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.193457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq57z\" (UniqueName: \"kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.193911 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.195175 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.206714 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.217609 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.294836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.294887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.294917 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.294938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t94lb\" (UniqueName: \"kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: 
\"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.294963 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq57z\" (UniqueName: \"kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.296044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.320448 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq57z\" (UniqueName: \"kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z\") pod \"dnsmasq-dns-675f4bcbfc-vx6f6\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.395940 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.395998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t94lb\" (UniqueName: \"kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.396081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.396965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.397163 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.412575 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t94lb\" (UniqueName: \"kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb\") pod \"dnsmasq-dns-78dd6ddcc-rczth\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc 
kubenswrapper[4717]: I1007 14:10:09.433515 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.523353 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.864515 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:09 crc kubenswrapper[4717]: I1007 14:10:09.991169 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:09 crc kubenswrapper[4717]: W1007 14:10:09.996820 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35748342_2d62_4655_8602_8467bfc1b3c4.slice/crio-08085a681e86b6a02a20ed6dbe4a84144282b4aa600437433e023ff76d72c9e9 WatchSource:0}: Error finding container 08085a681e86b6a02a20ed6dbe4a84144282b4aa600437433e023ff76d72c9e9: Status 404 returned error can't find the container with id 08085a681e86b6a02a20ed6dbe4a84144282b4aa600437433e023ff76d72c9e9 Oct 07 14:10:10 crc kubenswrapper[4717]: I1007 14:10:10.054196 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" event={"ID":"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0","Type":"ContainerStarted","Data":"c33d5a322124185711f16ad92ded8c3b49cc01f391d6b2005264e0d374f046ff"} Oct 07 14:10:10 crc kubenswrapper[4717]: I1007 14:10:10.056359 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" event={"ID":"35748342-2d62-4655-8602-8467bfc1b3c4","Type":"ContainerStarted","Data":"08085a681e86b6a02a20ed6dbe4a84144282b4aa600437433e023ff76d72c9e9"} Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.264235 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.289464 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.290644 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.306136 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.358293 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg448\" (UniqueName: \"kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.358385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.358409 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.460147 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg448\" (UniqueName: \"kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.460260 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.460321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.461276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.461293 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.505569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg448\" (UniqueName: 
\"kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448\") pod \"dnsmasq-dns-666b6646f7-r9qng\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.567319 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.603760 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.606136 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.612790 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.628559 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.665647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.665722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.665763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn9b6\" (UniqueName: \"kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.771474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.771542 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.771575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn9b6\" (UniqueName: \"kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.776964 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.781834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:12 crc kubenswrapper[4717]: I1007 14:10:12.803793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn9b6\" (UniqueName: \"kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6\") pod \"dnsmasq-dns-57d769cc4f-dl9s5\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.023193 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.173804 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:10:13 crc kubenswrapper[4717]: W1007 14:10:13.196387 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35731f7d_c41a_4572_99b7_67459d507e0d.slice/crio-94d754d111d9b3896f9ad967a68ed28a401557c0f8ad6b5c793f8eb15451154f WatchSource:0}: Error finding container 94d754d111d9b3896f9ad967a68ed28a401557c0f8ad6b5c793f8eb15451154f: Status 404 returned error can't find the container with id 94d754d111d9b3896f9ad967a68ed28a401557c0f8ad6b5c793f8eb15451154f Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.435339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.437398 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.441880 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.442297 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.442446 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.442666 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.442836 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.442905 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tlzgm" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.444210 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.449928 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.507411 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.508898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.508956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.508983 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509226 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsxqm\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509264 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.509506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: W1007 14:10:13.518611 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bf5f7d6_f070_4265_aa30_c53c81a623db.slice/crio-963b549b200c84c9733b4840a172bd4d113ef56cf15aa6744ac6def2cdb3d9f0 WatchSource:0}: Error finding container 963b549b200c84c9733b4840a172bd4d113ef56cf15aa6744ac6def2cdb3d9f0: Status 404 returned error can't find the container with id 963b549b200c84c9733b4840a172bd4d113ef56cf15aa6744ac6def2cdb3d9f0 Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610741 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610804 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc 
kubenswrapper[4717]: I1007 14:10:13.610845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610911 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.610954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.611047 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.611071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsxqm\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.611091 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.611127 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.612378 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.612828 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.614519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.614699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.615415 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.616286 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.617607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.625656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.627288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.629581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.634154 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsxqm\" (UniqueName: 
\"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.644683 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.767617 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.769123 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.771727 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.771893 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.773577 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.774070 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.775315 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.775483 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w2pxm" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.775838 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.776799 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.818557 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918550 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918610 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918632 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918670 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918827 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918933 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.918978 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.919059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.919081 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.919107 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:13 crc kubenswrapper[4717]: I1007 14:10:13.919159 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.020834 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021401 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021538 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021580 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021607 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021676 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.021702 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.022048 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.022230 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.023211 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.023900 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.023941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc 
kubenswrapper[4717]: I1007 14:10:14.026113 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.026400 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.026950 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.027409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.029503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.044498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.053948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.103245 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" event={"ID":"5bf5f7d6-f070-4265-aa30-c53c81a623db","Type":"ContainerStarted","Data":"963b549b200c84c9733b4840a172bd4d113ef56cf15aa6744ac6def2cdb3d9f0"} Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.104629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" event={"ID":"35731f7d-c41a-4572-99b7-67459d507e0d","Type":"ContainerStarted","Data":"94d754d111d9b3896f9ad967a68ed28a401557c0f8ad6b5c793f8eb15451154f"} Oct 07 14:10:14 crc kubenswrapper[4717]: I1007 14:10:14.140598 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.416603 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.418050 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.424477 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.424770 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.425064 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.425242 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.425418 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-l7j6f" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.428636 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.432357 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.469086 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.470500 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.472554 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.472928 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.474414 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.476174 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-swm7x" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.476898 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559237 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559256 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559288 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-secrets\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-default\") pod \"openstack-galera-0\" 
(UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559496 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5tf2\" (UniqueName: \"kubernetes.io/projected/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kube-api-access-l5tf2\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.559838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662137 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662231 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662321 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662387 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662453 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662626 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662675 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-secrets\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662723 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.662933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663420 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxg7j\" (UniqueName: \"kubernetes.io/projected/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kube-api-access-mxg7j\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663465 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5tf2\" (UniqueName: \"kubernetes.io/projected/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kube-api-access-l5tf2\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663559 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663570 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kolla-config\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.663967 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.664336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3eaa6cb6-249b-4f92-8942-9c60eee866e8-config-data-default\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.671254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.676435 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.678102 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/3eaa6cb6-249b-4f92-8942-9c60eee866e8-secrets\") 
pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.681299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5tf2\" (UniqueName: \"kubernetes.io/projected/3eaa6cb6-249b-4f92-8942-9c60eee866e8-kube-api-access-l5tf2\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.689632 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"3eaa6cb6-249b-4f92-8942-9c60eee866e8\") " pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.741491 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.764899 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.764954 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765067 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765091 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765147 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765248 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765363 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765798 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.765546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxg7j\" (UniqueName: \"kubernetes.io/projected/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kube-api-access-mxg7j\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.766335 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.766566 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.770467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.774586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " 
pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.777514 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.785244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxg7j\" (UniqueName: \"kubernetes.io/projected/e3c49618-d6f6-4379-8ac1-b474b0ffdeea-kube-api-access-mxg7j\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.792291 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e3c49618-d6f6-4379-8ac1-b474b0ffdeea\") " pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.918185 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.919122 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.921583 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.921960 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qcntb" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.922225 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 07 14:10:16 crc kubenswrapper[4717]: I1007 14:10:16.929111 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.070245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.070318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-kolla-config\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.070366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.070419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-config-data\") pod \"memcached-0\" (UID: 
\"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.070958 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl7sl\" (UniqueName: \"kubernetes.io/projected/3a693669-0da0-46aa-a110-75593768011d-kube-api-access-jl7sl\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.085658 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.172410 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl7sl\" (UniqueName: \"kubernetes.io/projected/3a693669-0da0-46aa-a110-75593768011d-kube-api-access-jl7sl\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.172499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.172525 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-kolla-config\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.172551 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.172574 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-config-data\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.173582 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-config-data\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.173636 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3a693669-0da0-46aa-a110-75593768011d-kolla-config\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.176339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.176404 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a693669-0da0-46aa-a110-75593768011d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.187870 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl7sl\" (UniqueName: \"kubernetes.io/projected/3a693669-0da0-46aa-a110-75593768011d-kube-api-access-jl7sl\") pod \"memcached-0\" (UID: \"3a693669-0da0-46aa-a110-75593768011d\") " pod="openstack/memcached-0" Oct 07 14:10:17 crc kubenswrapper[4717]: I1007 14:10:17.243424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.038652 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.040036 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.046713 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vxzfn" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.067653 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.204819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfw6x\" (UniqueName: \"kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x\") pod \"kube-state-metrics-0\" (UID: \"1bea9493-f1bb-4bce-8d15-f18fc71b3df1\") " pod="openstack/kube-state-metrics-0" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.306350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfw6x\" (UniqueName: \"kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x\") pod \"kube-state-metrics-0\" (UID: \"1bea9493-f1bb-4bce-8d15-f18fc71b3df1\") " pod="openstack/kube-state-metrics-0" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.336955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfw6x\" (UniqueName: \"kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x\") pod \"kube-state-metrics-0\" (UID: \"1bea9493-f1bb-4bce-8d15-f18fc71b3df1\") " pod="openstack/kube-state-metrics-0" Oct 07 14:10:19 crc kubenswrapper[4717]: I1007 14:10:19.365307 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:10:21 crc kubenswrapper[4717]: I1007 14:10:21.538521 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.155259 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vr6v8"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.180145 4717 util.go:30] "No sandbox for pod can be found. 
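[Annotation] The run above walks the kubelet volume manager through its stages for the galera and memcached pods: VerifyControllerAttachedVolume, MountVolume started, then MountVolume.MountDevice / MountVolume.SetUp succeeded, after which each pod reports that no sandbox exists yet; kube-state-metrics-0, a rabbitmq-server-0 update, and the two ovn-controller pods are admitted next. The SyncLoop ADD / SyncLoop UPDATE pairs are the kubelet's view of pod create and patch traffic; a small client-go watch sketch shows the same stream from the API side (kubeconfig path assumed, namespace taken from the log).

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        // Watch pod ADDED/MODIFIED events in the namespace; these correspond to the
        // kubelet's "SyncLoop ADD" / "SyncLoop UPDATE" entries in the log.
        w, err := cs.CoreV1().Pods("openstack").Watch(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        defer w.Stop()
        for ev := range w.ResultChan() {
            pod, ok := ev.Object.(*corev1.Pod)
            if !ok {
                continue
            }
            fmt.Printf("%-8s %-35s phase=%s\n", ev.Type, pod.Name, pod.Status.Phase)
        }
    }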
Need to start a new one" pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.183444 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-q4nhk" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.183567 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vr6v8"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.184992 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.185248 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.205574 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wvknm"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.207712 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.219660 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wvknm"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjql\" (UniqueName: \"kubernetes.io/projected/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-kube-api-access-2gjql\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-scripts\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356579 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-log-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356615 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-etc-ovs\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356642 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-scripts\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " 
pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356663 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6xp\" (UniqueName: \"kubernetes.io/projected/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-kube-api-access-jm6xp\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356684 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-lib\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356716 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-combined-ca-bundle\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356738 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-run\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-ovn-controller-tls-certs\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.356821 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-log\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.459864 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-ovn-controller-tls-certs\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.459910 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-log\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 
07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.459958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjql\" (UniqueName: \"kubernetes.io/projected/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-kube-api-access-2gjql\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.459978 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.459992 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-scripts\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-log-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-etc-ovs\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-scripts\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460121 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6xp\" (UniqueName: \"kubernetes.io/projected/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-kube-api-access-jm6xp\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-lib\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460171 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-combined-ca-bundle\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460189 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-run\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460209 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-log\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460594 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-run-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460778 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-etc-ovs\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.460956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-var-log-ovn\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.461085 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-lib\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.461145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-var-run\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.462247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-scripts\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.463232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-scripts\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.467460 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-ovn-controller-tls-certs\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.481574 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjql\" (UniqueName: \"kubernetes.io/projected/1c9dfdb0-e3ce-4c80-903f-006f20eacf29-kube-api-access-2gjql\") pod \"ovn-controller-ovs-wvknm\" (UID: \"1c9dfdb0-e3ce-4c80-903f-006f20eacf29\") " pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.483638 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-combined-ca-bundle\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.485903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6xp\" (UniqueName: \"kubernetes.io/projected/f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c-kube-api-access-jm6xp\") pod \"ovn-controller-vr6v8\" (UID: \"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c\") " pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.501567 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.528617 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.583415 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.584837 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.588094 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.588293 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.588355 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.588601 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.588799 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wznsp" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.604799 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764480 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764616 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764640 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764693 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.764713 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.765066 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.765192 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwc57\" (UniqueName: \"kubernetes.io/projected/272dcaf8-29ce-4329-8301-4123eea773dc-kube-api-access-mwc57\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.866996 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867060 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwc57\" (UniqueName: \"kubernetes.io/projected/272dcaf8-29ce-4329-8301-4123eea773dc-kube-api-access-mwc57\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867095 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867129 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867166 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867200 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 
14:10:22.867306 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.867814 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.868581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-config\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.868861 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/272dcaf8-29ce-4329-8301-4123eea773dc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.873074 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.873339 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.881614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/272dcaf8-29ce-4329-8301-4123eea773dc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.892744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwc57\" (UniqueName: \"kubernetes.io/projected/272dcaf8-29ce-4329-8301-4123eea773dc-kube-api-access-mwc57\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.904800 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"272dcaf8-29ce-4329-8301-4123eea773dc\") " pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:22 crc kubenswrapper[4717]: I1007 14:10:22.928879 4717 util.go:30] "No sandbox for pod can be found. 
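[Annotation] For ovsdbserver-nb-0 the interesting volume is local-storage06-crc: unlike the ConfigMap/Secret/emptyDir mounts, the local PersistentVolume first passes MountVolume.MountDevice, which resolves the device mount path /mnt/openstack/pv06, and only then MountVolume.SetUp (the galera pods above used pv07 and pv05 the same way). The backing paths can be read back from the PersistentVolume objects; a client-go sketch listing local PVs and their bound claims (kubeconfig path assumed).

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }

        pvs, err := cs.CoreV1().PersistentVolumes().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, pv := range pvs.Items {
            if pv.Spec.Local == nil {
                continue // keep only local volumes such as local-storage05/06/07-crc
            }
            claim := "<unbound>"
            if pv.Spec.ClaimRef != nil {
                claim = pv.Spec.ClaimRef.Namespace + "/" + pv.Spec.ClaimRef.Name
            }
            fmt.Printf("%-22s path=%-25s claim=%s\n", pv.Name, pv.Spec.Local.Path, claim)
        }
    }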
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:25 crc kubenswrapper[4717]: W1007 14:10:25.964678 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82faaf9f_edd1_4ea3_85f9_8b359fbd99a2.slice/crio-fef88196478f534017c6ef0d5bfc8001904b391847f0c33e2ceff9f20fc47ae8 WatchSource:0}: Error finding container fef88196478f534017c6ef0d5bfc8001904b391847f0c33e2ceff9f20fc47ae8: Status 404 returned error can't find the container with id fef88196478f534017c6ef0d5bfc8001904b391847f0c33e2ceff9f20fc47ae8 Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.213809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerStarted","Data":"fef88196478f534017c6ef0d5bfc8001904b391847f0c33e2ceff9f20fc47ae8"} Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.734892 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.735259 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t94lb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-rczth_openstack(35748342-2d62-4655-8602-8467bfc1b3c4): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled" logger="UnhandledError" Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.736535 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" podUID="35748342-2d62-4655-8602-8467bfc1b3c4" Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.777841 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.778000 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hq57z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-vx6f6_openstack(cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:10:26 crc kubenswrapper[4717]: E1007 14:10:26.779358 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" podUID="cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.905935 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.907323 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.912430 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.912443 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h9hdc" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.912811 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.913081 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 07 14:10:26 crc kubenswrapper[4717]: I1007 14:10:26.925569 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049137 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8hf7\" (UniqueName: \"kubernetes.io/projected/2d203dc4-3d0b-4e7c-b38b-96f231f12071-kube-api-access-r8hf7\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049533 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.049578 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151058 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151126 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151178 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151264 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151290 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8hf7\" (UniqueName: \"kubernetes.io/projected/2d203dc4-3d0b-4e7c-b38b-96f231f12071-kube-api-access-r8hf7\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.151347 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.152430 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.152463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-config\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.152543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.153308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2d203dc4-3d0b-4e7c-b38b-96f231f12071-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.158100 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.160376 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.168374 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d203dc4-3d0b-4e7c-b38b-96f231f12071-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.171255 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8hf7\" (UniqueName: \"kubernetes.io/projected/2d203dc4-3d0b-4e7c-b38b-96f231f12071-kube-api-access-r8hf7\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.172494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2d203dc4-3d0b-4e7c-b38b-96f231f12071\") " pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.228977 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.250865 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: W1007 14:10:27.310760 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b16c62_141d_4bf5_ba4c_79590bdd39cd.slice/crio-7846d39f651bd4779ca7d799ac96ddf741a592cf173d69a34928ba0c596ed704 WatchSource:0}: Error finding container 7846d39f651bd4779ca7d799ac96ddf741a592cf173d69a34928ba0c596ed704: Status 404 returned error can't find the container with id 7846d39f651bd4779ca7d799ac96ddf741a592cf173d69a34928ba0c596ed704 Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.370196 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.382466 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.646791 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.652168 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.704028 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.712921 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.730495 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vr6v8"] Oct 07 14:10:27 crc kubenswrapper[4717]: W1007 14:10:27.730506 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bea9493_f1bb_4bce_8d15_f18fc71b3df1.slice/crio-a73447455ba02d450fcb9261323de3d993de56b49a2c2e61ff7aecb78c179201 WatchSource:0}: Error finding container a73447455ba02d450fcb9261323de3d993de56b49a2c2e61ff7aecb78c179201: Status 404 returned error can't find the container with id a73447455ba02d450fcb9261323de3d993de56b49a2c2e61ff7aecb78c179201 Oct 07 14:10:27 crc kubenswrapper[4717]: W1007 14:10:27.733492 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9c414ed_d3f6_42cc_8ab0_14eab36e7d0c.slice/crio-a02304e34cd0205c89bf19a187c7c3a86ef6b924d567a0516ac195ef1f8c0c7d WatchSource:0}: Error finding container a02304e34cd0205c89bf19a187c7c3a86ef6b924d567a0516ac195ef1f8c0c7d: Status 404 returned error can't find the container with id a02304e34cd0205c89bf19a187c7c3a86ef6b924d567a0516ac195ef1f8c0c7d Oct 07 14:10:27 crc kubenswrapper[4717]: W1007 14:10:27.739922 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c49618_d6f6_4379_8ac1_b474b0ffdeea.slice/crio-e229c1301aacc91308431b280ebef6d4695ac3fa95f4ec4a22dc91b52b9e6f2b WatchSource:0}: Error finding container e229c1301aacc91308431b280ebef6d4695ac3fa95f4ec4a22dc91b52b9e6f2b: Status 404 returned error can't find the container with id e229c1301aacc91308431b280ebef6d4695ac3fa95f4ec4a22dc91b52b9e6f2b Oct 07 14:10:27 crc 
kubenswrapper[4717]: I1007 14:10:27.762741 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq57z\" (UniqueName: \"kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z\") pod \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.762864 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config\") pod \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\" (UID: \"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0\") " Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.762924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc\") pod \"35748342-2d62-4655-8602-8467bfc1b3c4\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.762954 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config\") pod \"35748342-2d62-4655-8602-8467bfc1b3c4\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.762978 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t94lb\" (UniqueName: \"kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb\") pod \"35748342-2d62-4655-8602-8467bfc1b3c4\" (UID: \"35748342-2d62-4655-8602-8467bfc1b3c4\") " Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.764758 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config" (OuterVolumeSpecName: "config") pod "35748342-2d62-4655-8602-8467bfc1b3c4" (UID: "35748342-2d62-4655-8602-8467bfc1b3c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.764812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config" (OuterVolumeSpecName: "config") pod "cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0" (UID: "cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.765227 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35748342-2d62-4655-8602-8467bfc1b3c4" (UID: "35748342-2d62-4655-8602-8467bfc1b3c4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.768928 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb" (OuterVolumeSpecName: "kube-api-access-t94lb") pod "35748342-2d62-4655-8602-8467bfc1b3c4" (UID: "35748342-2d62-4655-8602-8467bfc1b3c4"). InnerVolumeSpecName "kube-api-access-t94lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.770171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z" (OuterVolumeSpecName: "kube-api-access-hq57z") pod "cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0" (UID: "cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0"). InnerVolumeSpecName "kube-api-access-hq57z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.853396 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 07 14:10:27 crc kubenswrapper[4717]: W1007 14:10:27.853863 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod272dcaf8_29ce_4329_8301_4123eea773dc.slice/crio-f97da59df9052fc617b8c4e4061eda921fdb7f4f50580d35d1d40e78b924c0ef WatchSource:0}: Error finding container f97da59df9052fc617b8c4e4061eda921fdb7f4f50580d35d1d40e78b924c0ef: Status 404 returned error can't find the container with id f97da59df9052fc617b8c4e4061eda921fdb7f4f50580d35d1d40e78b924c0ef Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.864832 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.864871 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.864885 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35748342-2d62-4655-8602-8467bfc1b3c4-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.864898 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t94lb\" (UniqueName: \"kubernetes.io/projected/35748342-2d62-4655-8602-8467bfc1b3c4-kube-api-access-t94lb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:27 crc kubenswrapper[4717]: I1007 14:10:27.864912 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq57z\" (UniqueName: \"kubernetes.io/projected/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0-kube-api-access-hq57z\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:28 crc kubenswrapper[4717]: W1007 14:10:28.021060 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d203dc4_3d0b_4e7c_b38b_96f231f12071.slice/crio-f1a367da80e9352368f5c8e8946400fcc2217ddd4656a003693f632db4e9bd0b WatchSource:0}: Error finding container f1a367da80e9352368f5c8e8946400fcc2217ddd4656a003693f632db4e9bd0b: Status 404 returned error can't find the container with id f1a367da80e9352368f5c8e8946400fcc2217ddd4656a003693f632db4e9bd0b Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.024764 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.251121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3c49618-d6f6-4379-8ac1-b474b0ffdeea","Type":"ContainerStarted","Data":"e229c1301aacc91308431b280ebef6d4695ac3fa95f4ec4a22dc91b52b9e6f2b"} Oct 07 14:10:28 
crc kubenswrapper[4717]: I1007 14:10:28.252233 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"272dcaf8-29ce-4329-8301-4123eea773dc","Type":"ContainerStarted","Data":"f97da59df9052fc617b8c4e4061eda921fdb7f4f50580d35d1d40e78b924c0ef"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.253549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bea9493-f1bb-4bce-8d15-f18fc71b3df1","Type":"ContainerStarted","Data":"a73447455ba02d450fcb9261323de3d993de56b49a2c2e61ff7aecb78c179201"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.254637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3a693669-0da0-46aa-a110-75593768011d","Type":"ContainerStarted","Data":"a63d0cc55a1bc3e1020babc5654900247e1eac1cb75f859e44c49f60cf0a75a6"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.255962 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d203dc4-3d0b-4e7c-b38b-96f231f12071","Type":"ContainerStarted","Data":"f1a367da80e9352368f5c8e8946400fcc2217ddd4656a003693f632db4e9bd0b"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.257278 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerStarted","Data":"7846d39f651bd4779ca7d799ac96ddf741a592cf173d69a34928ba0c596ed704"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.259436 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" event={"ID":"cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0","Type":"ContainerDied","Data":"c33d5a322124185711f16ad92ded8c3b49cc01f391d6b2005264e0d374f046ff"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.259747 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-vx6f6" Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.264255 4717 generic.go:334] "Generic (PLEG): container finished" podID="35731f7d-c41a-4572-99b7-67459d507e0d" containerID="1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2" exitCode=0 Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.264357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" event={"ID":"35731f7d-c41a-4572-99b7-67459d507e0d","Type":"ContainerDied","Data":"1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.267284 4717 generic.go:334] "Generic (PLEG): container finished" podID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerID="f8da64cd006dd4cae567ac55a59baa5f3b409b95c88ae90d323ae06c1b0b49ca" exitCode=0 Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.267372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" event={"ID":"5bf5f7d6-f070-4265-aa30-c53c81a623db","Type":"ContainerDied","Data":"f8da64cd006dd4cae567ac55a59baa5f3b409b95c88ae90d323ae06c1b0b49ca"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.270038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3eaa6cb6-249b-4f92-8942-9c60eee866e8","Type":"ContainerStarted","Data":"ac82496fd910ff339c04c9f3d054e49b2557ee9949bcbdf90de950c055e8c853"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.271374 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8" event={"ID":"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c","Type":"ContainerStarted","Data":"a02304e34cd0205c89bf19a187c7c3a86ef6b924d567a0516ac195ef1f8c0c7d"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.272529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" event={"ID":"35748342-2d62-4655-8602-8467bfc1b3c4","Type":"ContainerDied","Data":"08085a681e86b6a02a20ed6dbe4a84144282b4aa600437433e023ff76d72c9e9"} Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.272583 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-rczth" Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.355841 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.371978 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-vx6f6"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.403635 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.411464 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-rczth"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.641687 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wvknm"] Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.879879 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35748342-2d62-4655-8602-8467bfc1b3c4" path="/var/lib/kubelet/pods/35748342-2d62-4655-8602-8467bfc1b3c4/volumes" Oct 07 14:10:28 crc kubenswrapper[4717]: I1007 14:10:28.880355 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0" path="/var/lib/kubelet/pods/cc9cb823-f01d-4fe5-9c9b-733eb1e17ca0/volumes" Oct 07 14:10:29 crc kubenswrapper[4717]: W1007 14:10:29.826217 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c9dfdb0_e3ce_4c80_903f_006f20eacf29.slice/crio-49b9534f1d999eb12e57c5d860e39a186cf7c7d6787830e157164c01c25c6bf9 WatchSource:0}: Error finding container 49b9534f1d999eb12e57c5d860e39a186cf7c7d6787830e157164c01c25c6bf9: Status 404 returned error can't find the container with id 49b9534f1d999eb12e57c5d860e39a186cf7c7d6787830e157164c01c25c6bf9 Oct 07 14:10:30 crc kubenswrapper[4717]: I1007 14:10:30.286648 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvknm" event={"ID":"1c9dfdb0-e3ce-4c80-903f-006f20eacf29","Type":"ContainerStarted","Data":"49b9534f1d999eb12e57c5d860e39a186cf7c7d6787830e157164c01c25c6bf9"} Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.320809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" event={"ID":"5bf5f7d6-f070-4265-aa30-c53c81a623db","Type":"ContainerStarted","Data":"d8bb6339a49cfcd397eeb46939db5d72dbc005017c4fed2428fcbed36b8c7895"} Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.321305 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.322931 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3a693669-0da0-46aa-a110-75593768011d","Type":"ContainerStarted","Data":"7ce2290efec01c0cce23e4e0be016aac5f9889674d5e1c4a62628e666c9cc3eb"} Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.323039 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.324324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" event={"ID":"35731f7d-c41a-4572-99b7-67459d507e0d","Type":"ContainerStarted","Data":"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4"} Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 
14:10:35.324468 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.347674 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" podStartSLOduration=9.944581671 podStartE2EDuration="23.347653038s" podCreationTimestamp="2025-10-07 14:10:12 +0000 UTC" firstStartedPulling="2025-10-07 14:10:13.525033972 +0000 UTC m=+995.352959764" lastFinishedPulling="2025-10-07 14:10:26.928105339 +0000 UTC m=+1008.756031131" observedRunningTime="2025-10-07 14:10:35.341484937 +0000 UTC m=+1017.169410729" watchObservedRunningTime="2025-10-07 14:10:35.347653038 +0000 UTC m=+1017.175578830" Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.368743 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.010310502 podStartE2EDuration="19.368724811s" podCreationTimestamp="2025-10-07 14:10:16 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.406958829 +0000 UTC m=+1009.234884621" lastFinishedPulling="2025-10-07 14:10:33.765373138 +0000 UTC m=+1015.593298930" observedRunningTime="2025-10-07 14:10:35.363874337 +0000 UTC m=+1017.191800129" watchObservedRunningTime="2025-10-07 14:10:35.368724811 +0000 UTC m=+1017.196650603" Oct 07 14:10:35 crc kubenswrapper[4717]: I1007 14:10:35.384826 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" podStartSLOduration=9.652366602 podStartE2EDuration="23.384807457s" podCreationTimestamp="2025-10-07 14:10:12 +0000 UTC" firstStartedPulling="2025-10-07 14:10:13.19949091 +0000 UTC m=+995.027416702" lastFinishedPulling="2025-10-07 14:10:26.931931765 +0000 UTC m=+1008.759857557" observedRunningTime="2025-10-07 14:10:35.378323937 +0000 UTC m=+1017.206249729" watchObservedRunningTime="2025-10-07 14:10:35.384807457 +0000 UTC m=+1017.212733249" Oct 07 14:10:36 crc kubenswrapper[4717]: I1007 14:10:36.334811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3eaa6cb6-249b-4f92-8942-9c60eee866e8","Type":"ContainerStarted","Data":"1424b2c8910cea1aa9dd0432a02cceb90b21b99703197f1565dad1e28658f8ba"} Oct 07 14:10:37 crc kubenswrapper[4717]: I1007 14:10:37.346460 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerStarted","Data":"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3"} Oct 07 14:10:37 crc kubenswrapper[4717]: I1007 14:10:37.348829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerStarted","Data":"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.358195 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3c49618-d6f6-4379-8ac1-b474b0ffdeea","Type":"ContainerStarted","Data":"d291581a05225ed68df2e51234c715a0a56d9310bc74adeb2f6aa78bce94150d"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.359913 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"272dcaf8-29ce-4329-8301-4123eea773dc","Type":"ContainerStarted","Data":"245402347ca81e427397201d42da167fa6e73223b1efa5c1417167b08df644e3"} Oct 07 14:10:38 crc 
kubenswrapper[4717]: I1007 14:10:38.361682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bea9493-f1bb-4bce-8d15-f18fc71b3df1","Type":"ContainerStarted","Data":"0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.361887 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.363559 4717 generic.go:334] "Generic (PLEG): container finished" podID="1c9dfdb0-e3ce-4c80-903f-006f20eacf29" containerID="43b50ef634960374ae3715c1ba392551d1d17d1b3a609ae591fe264747508c72" exitCode=0 Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.363616 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvknm" event={"ID":"1c9dfdb0-e3ce-4c80-903f-006f20eacf29","Type":"ContainerDied","Data":"43b50ef634960374ae3715c1ba392551d1d17d1b3a609ae591fe264747508c72"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.365115 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8" event={"ID":"f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c","Type":"ContainerStarted","Data":"24ec36e27db3fdb7de365b6b65640d6603eb3794c185085c1f4c9f5004e4ce9b"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.365235 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vr6v8" Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.367706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d203dc4-3d0b-4e7c-b38b-96f231f12071","Type":"ContainerStarted","Data":"c61e1c79f01650a39b3e5cdf137c32a5b8fd3829ef804cb9048926a90035fa2d"} Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.423909 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.569265781 podStartE2EDuration="19.423892669s" podCreationTimestamp="2025-10-07 14:10:19 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.733977932 +0000 UTC m=+1009.561903724" lastFinishedPulling="2025-10-07 14:10:37.58860482 +0000 UTC m=+1019.416530612" observedRunningTime="2025-10-07 14:10:38.421199654 +0000 UTC m=+1020.249125446" watchObservedRunningTime="2025-10-07 14:10:38.423892669 +0000 UTC m=+1020.251818461" Oct 07 14:10:38 crc kubenswrapper[4717]: I1007 14:10:38.440964 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vr6v8" podStartSLOduration=9.204556212 podStartE2EDuration="16.440946322s" podCreationTimestamp="2025-10-07 14:10:22 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.735227787 +0000 UTC m=+1009.563153579" lastFinishedPulling="2025-10-07 14:10:34.971617897 +0000 UTC m=+1016.799543689" observedRunningTime="2025-10-07 14:10:38.438968887 +0000 UTC m=+1020.266894669" watchObservedRunningTime="2025-10-07 14:10:38.440946322 +0000 UTC m=+1020.268872114" Oct 07 14:10:39 crc kubenswrapper[4717]: I1007 14:10:39.379597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvknm" event={"ID":"1c9dfdb0-e3ce-4c80-903f-006f20eacf29","Type":"ContainerStarted","Data":"2815e2d06ca2c5086a1c297aaef750395138c2649da5d1ce02caba2439115961"} Oct 07 14:10:39 crc kubenswrapper[4717]: I1007 14:10:39.380475 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wvknm" 
event={"ID":"1c9dfdb0-e3ce-4c80-903f-006f20eacf29","Type":"ContainerStarted","Data":"c7ae3c7b354e0a5729c9413e774dcc140e6d89115d3d9dcdf981f3dc52d5ca1b"} Oct 07 14:10:40 crc kubenswrapper[4717]: I1007 14:10:40.385760 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:40 crc kubenswrapper[4717]: I1007 14:10:40.385807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.396142 4717 generic.go:334] "Generic (PLEG): container finished" podID="3eaa6cb6-249b-4f92-8942-9c60eee866e8" containerID="1424b2c8910cea1aa9dd0432a02cceb90b21b99703197f1565dad1e28658f8ba" exitCode=0 Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.396350 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3eaa6cb6-249b-4f92-8942-9c60eee866e8","Type":"ContainerDied","Data":"1424b2c8910cea1aa9dd0432a02cceb90b21b99703197f1565dad1e28658f8ba"} Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.398562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2d203dc4-3d0b-4e7c-b38b-96f231f12071","Type":"ContainerStarted","Data":"18965d00dd2b17310a6aeedbdfb8f8483a905ee478f032f2c843ebbe0b08d3ad"} Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.415281 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"272dcaf8-29ce-4329-8301-4123eea773dc","Type":"ContainerStarted","Data":"aa229711dde9eccf52ae2ff60c2084bf843dfc52798c76d003c1d313084ae4e3"} Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.422542 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wvknm" podStartSLOduration=14.571819552000001 podStartE2EDuration="19.422522669s" podCreationTimestamp="2025-10-07 14:10:22 +0000 UTC" firstStartedPulling="2025-10-07 14:10:29.830559713 +0000 UTC m=+1011.658485505" lastFinishedPulling="2025-10-07 14:10:34.68126283 +0000 UTC m=+1016.509188622" observedRunningTime="2025-10-07 14:10:39.401914682 +0000 UTC m=+1021.229840474" watchObservedRunningTime="2025-10-07 14:10:41.422522669 +0000 UTC m=+1023.250448471" Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.444198 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.618442553 podStartE2EDuration="16.444179659s" podCreationTimestamp="2025-10-07 14:10:25 +0000 UTC" firstStartedPulling="2025-10-07 14:10:28.029950364 +0000 UTC m=+1009.857876166" lastFinishedPulling="2025-10-07 14:10:40.85568748 +0000 UTC m=+1022.683613272" observedRunningTime="2025-10-07 14:10:41.439696755 +0000 UTC m=+1023.267622547" watchObservedRunningTime="2025-10-07 14:10:41.444179659 +0000 UTC m=+1023.272105451" Oct 07 14:10:41 crc kubenswrapper[4717]: I1007 14:10:41.466394 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.475221424 podStartE2EDuration="20.466370064s" podCreationTimestamp="2025-10-07 14:10:21 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.857386082 +0000 UTC m=+1009.685311874" lastFinishedPulling="2025-10-07 14:10:40.848534722 +0000 UTC m=+1022.676460514" observedRunningTime="2025-10-07 14:10:41.458517077 +0000 UTC m=+1023.286442869" watchObservedRunningTime="2025-10-07 14:10:41.466370064 +0000 UTC m=+1023.294295856" Oct 07 14:10:42 crc 
kubenswrapper[4717]: I1007 14:10:42.230339 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.230641 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.245061 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.280036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.426072 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3eaa6cb6-249b-4f92-8942-9c60eee866e8","Type":"ContainerStarted","Data":"54f5deeef4c0ebbdba3f3676bfc1d80c30773b7b5e1470bdb45d75509ad9ff72"} Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.428498 4717 generic.go:334] "Generic (PLEG): container finished" podID="e3c49618-d6f6-4379-8ac1-b474b0ffdeea" containerID="d291581a05225ed68df2e51234c715a0a56d9310bc74adeb2f6aa78bce94150d" exitCode=0 Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.428544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3c49618-d6f6-4379-8ac1-b474b0ffdeea","Type":"ContainerDied","Data":"d291581a05225ed68df2e51234c715a0a56d9310bc74adeb2f6aa78bce94150d"} Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.462342 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.490061544 podStartE2EDuration="27.462322045s" podCreationTimestamp="2025-10-07 14:10:15 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.406462505 +0000 UTC m=+1009.234388297" lastFinishedPulling="2025-10-07 14:10:34.378722996 +0000 UTC m=+1016.206648798" observedRunningTime="2025-10-07 14:10:42.457420729 +0000 UTC m=+1024.285346521" watchObservedRunningTime="2025-10-07 14:10:42.462322045 +0000 UTC m=+1024.290247837" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.493996 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.630162 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.815462 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.815751 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" containerID="cri-o://d8bb6339a49cfcd397eeb46939db5d72dbc005017c4fed2428fcbed36b8c7895" gracePeriod=10 Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.818169 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.860522 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.862202 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.864133 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.879368 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.929809 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.946674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.946737 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.946876 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbgcj\" (UniqueName: \"kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.947329 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.974841 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-lh5mj"] Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.976025 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.978684 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 07 14:10:42 crc kubenswrapper[4717]: I1007 14:10:42.987890 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lh5mj"] Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.023819 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048554 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mgbd\" (UniqueName: \"kubernetes.io/projected/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-kube-api-access-9mgbd\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbgcj\" (UniqueName: \"kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048719 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovs-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048755 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovn-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048790 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-config\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " 
pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.048938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-combined-ca-bundle\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.049018 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.049739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.049939 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.050241 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.073976 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbgcj\" (UniqueName: \"kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj\") pod \"dnsmasq-dns-7f896c8c65-hgk6p\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.138175 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.138854 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150247 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mgbd\" (UniqueName: \"kubernetes.io/projected/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-kube-api-access-9mgbd\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150536 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovs-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovn-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150805 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-config\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150895 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovs-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.150941 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-ovn-rundir\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.151063 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.151192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-combined-ca-bundle\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.152055 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-config\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc 
kubenswrapper[4717]: I1007 14:10:43.154414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.165811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-combined-ca-bundle\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.168028 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.169247 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.178684 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.184271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mgbd\" (UniqueName: \"kubernetes.io/projected/190219f8-b7b5-4cbd-ab5b-3fd1880f9eef-kube-api-access-9mgbd\") pod \"ovn-controller-metrics-lh5mj\" (UID: \"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef\") " pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.189665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.257623 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srfjr\" (UniqueName: \"kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.257760 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.257980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.258034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.258060 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.291610 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-lh5mj" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.360085 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srfjr\" (UniqueName: \"kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.360169 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.360269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.360309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.360334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.361364 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.362412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.366421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 
14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.371810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.385164 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srfjr\" (UniqueName: \"kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr\") pod \"dnsmasq-dns-86db49b7ff-h8d4n\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.435172 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:43 crc kubenswrapper[4717]: W1007 14:10:43.441043 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4179936c_69ae_4c85_bebd_50ce6531c81f.slice/crio-66d2e8c0eecf72284c13902f5e325efda38e11ef9f695c12d8d333c4a799bfb1 WatchSource:0}: Error finding container 66d2e8c0eecf72284c13902f5e325efda38e11ef9f695c12d8d333c4a799bfb1: Status 404 returned error can't find the container with id 66d2e8c0eecf72284c13902f5e325efda38e11ef9f695c12d8d333c4a799bfb1 Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.449777 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e3c49618-d6f6-4379-8ac1-b474b0ffdeea","Type":"ContainerStarted","Data":"5b7f4090d80b86a7048f32b9450f3d9bafa55a052e0a5ed8cf9595b163ef2d59"} Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.456378 4717 generic.go:334] "Generic (PLEG): container finished" podID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerID="d8bb6339a49cfcd397eeb46939db5d72dbc005017c4fed2428fcbed36b8c7895" exitCode=0 Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.456719 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" event={"ID":"5bf5f7d6-f070-4265-aa30-c53c81a623db","Type":"ContainerDied","Data":"d8bb6339a49cfcd397eeb46939db5d72dbc005017c4fed2428fcbed36b8c7895"} Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.486480 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.56992838 podStartE2EDuration="28.486462956s" podCreationTimestamp="2025-10-07 14:10:15 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.743952588 +0000 UTC m=+1009.571878380" lastFinishedPulling="2025-10-07 14:10:34.660487164 +0000 UTC m=+1016.488412956" observedRunningTime="2025-10-07 14:10:43.479703229 +0000 UTC m=+1025.307629021" watchObservedRunningTime="2025-10-07 14:10:43.486462956 +0000 UTC m=+1025.314388748" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.533116 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.781275 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-lh5mj"] Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.939869 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:43 crc kubenswrapper[4717]: I1007 14:10:43.973933 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.022902 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.464602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" event={"ID":"4179936c-69ae-4c85-bebd-50ce6531c81f","Type":"ContainerStarted","Data":"66d2e8c0eecf72284c13902f5e325efda38e11ef9f695c12d8d333c4a799bfb1"} Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.465900 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lh5mj" event={"ID":"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef","Type":"ContainerStarted","Data":"fb53f8950c32d139c32b2621dc13533deeddff17de66fac786cfe2b7021b3368"} Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.466705 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" event={"ID":"8c7a43c4-a720-4699-8343-3daeebaef9c2","Type":"ContainerStarted","Data":"0ecaa712a02585d2b724d3f96591e777d34aa77f0d06a8f4ed3bcdc11befe564"} Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.520558 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.684592 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.686534 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.688564 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.689386 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.690503 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.692486 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qrslx" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.698115 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.815975 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-config\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816046 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbh7\" (UniqueName: \"kubernetes.io/projected/57dca108-f9e5-443a-aa97-01263cf96863-kube-api-access-plbh7\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-scripts\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.816187 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57dca108-f9e5-443a-aa97-01263cf96863-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: 
I1007 14:10:44.920400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57dca108-f9e5-443a-aa97-01263cf96863-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.920906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-config\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbh7\" (UniqueName: \"kubernetes.io/projected/57dca108-f9e5-443a-aa97-01263cf96863-kube-api-access-plbh7\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/57dca108-f9e5-443a-aa97-01263cf96863-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921423 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921710 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-scripts\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.921832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.922262 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-config\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.922705 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57dca108-f9e5-443a-aa97-01263cf96863-scripts\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.927224 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.927361 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.935719 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57dca108-f9e5-443a-aa97-01263cf96863-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:44 crc kubenswrapper[4717]: I1007 14:10:44.942286 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbh7\" (UniqueName: \"kubernetes.io/projected/57dca108-f9e5-443a-aa97-01263cf96863-kube-api-access-plbh7\") pod \"ovn-northd-0\" (UID: \"57dca108-f9e5-443a-aa97-01263cf96863\") " pod="openstack/ovn-northd-0" Oct 07 14:10:45 crc kubenswrapper[4717]: I1007 14:10:45.009301 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 07 14:10:45 crc kubenswrapper[4717]: I1007 14:10:45.430546 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 07 14:10:45 crc kubenswrapper[4717]: W1007 14:10:45.439241 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57dca108_f9e5_443a_aa97_01263cf96863.slice/crio-dae9b8f5a4c33da201143cde02dd45ac37c9091b6ec9bac88843abf22364eb72 WatchSource:0}: Error finding container dae9b8f5a4c33da201143cde02dd45ac37c9091b6ec9bac88843abf22364eb72: Status 404 returned error can't find the container with id dae9b8f5a4c33da201143cde02dd45ac37c9091b6ec9bac88843abf22364eb72 Oct 07 14:10:45 crc kubenswrapper[4717]: I1007 14:10:45.473848 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57dca108-f9e5-443a-aa97-01263cf96863","Type":"ContainerStarted","Data":"dae9b8f5a4c33da201143cde02dd45ac37c9091b6ec9bac88843abf22364eb72"} Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:46.742465 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:46.743062 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:47.085966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:47.086018 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:48.024554 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.98:5353: connect: connection refused" Oct 07 
14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.370166 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.442961 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.464908 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.466560 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.497593 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.497756 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvng4\" (UniqueName: \"kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.497801 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.497834 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.497867 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.498722 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.599455 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.599531 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: 
\"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.599595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.599712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvng4\" (UniqueName: \"kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.599745 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.600469 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.600500 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.600678 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.600703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.622944 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvng4\" (UniqueName: \"kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4\") pod \"dnsmasq-dns-698758b865-fntpl\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.782690 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:49 crc kubenswrapper[4717]: I1007 14:10:49.948245 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.005100 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config\") pod \"5bf5f7d6-f070-4265-aa30-c53c81a623db\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.005238 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn9b6\" (UniqueName: \"kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6\") pod \"5bf5f7d6-f070-4265-aa30-c53c81a623db\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.005308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc\") pod \"5bf5f7d6-f070-4265-aa30-c53c81a623db\" (UID: \"5bf5f7d6-f070-4265-aa30-c53c81a623db\") " Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.012826 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6" (OuterVolumeSpecName: "kube-api-access-mn9b6") pod "5bf5f7d6-f070-4265-aa30-c53c81a623db" (UID: "5bf5f7d6-f070-4265-aa30-c53c81a623db"). InnerVolumeSpecName "kube-api-access-mn9b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.064171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config" (OuterVolumeSpecName: "config") pod "5bf5f7d6-f070-4265-aa30-c53c81a623db" (UID: "5bf5f7d6-f070-4265-aa30-c53c81a623db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.067436 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bf5f7d6-f070-4265-aa30-c53c81a623db" (UID: "5bf5f7d6-f070-4265-aa30-c53c81a623db"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.107230 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.107263 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn9b6\" (UniqueName: \"kubernetes.io/projected/5bf5f7d6-f070-4265-aa30-c53c81a623db-kube-api-access-mn9b6\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.107273 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf5f7d6-f070-4265-aa30-c53c81a623db-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.231768 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:10:50 crc kubenswrapper[4717]: W1007 14:10:50.234266 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e24f9c_97b5_49a4_83d8_5f0c0f3b70ee.slice/crio-fe86ab0a8134eee760214faef0e5a79af497c6f7cf880f8575867627b5fbd4a9 WatchSource:0}: Error finding container fe86ab0a8134eee760214faef0e5a79af497c6f7cf880f8575867627b5fbd4a9: Status 404 returned error can't find the container with id fe86ab0a8134eee760214faef0e5a79af497c6f7cf880f8575867627b5fbd4a9 Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.528529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" event={"ID":"5bf5f7d6-f070-4265-aa30-c53c81a623db","Type":"ContainerDied","Data":"963b549b200c84c9733b4840a172bd4d113ef56cf15aa6744ac6def2cdb3d9f0"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.528575 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dl9s5" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.528913 4717 scope.go:117] "RemoveContainer" containerID="d8bb6339a49cfcd397eeb46939db5d72dbc005017c4fed2428fcbed36b8c7895" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.530954 4717 generic.go:334] "Generic (PLEG): container finished" podID="4179936c-69ae-4c85-bebd-50ce6531c81f" containerID="9ec345cf9d52ad77587f4b2f420986a80c03ea0b38d0ea78ef58c203d2ef6929" exitCode=0 Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.531050 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" event={"ID":"4179936c-69ae-4c85-bebd-50ce6531c81f","Type":"ContainerDied","Data":"9ec345cf9d52ad77587f4b2f420986a80c03ea0b38d0ea78ef58c203d2ef6929"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.536889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-lh5mj" event={"ID":"190219f8-b7b5-4cbd-ab5b-3fd1880f9eef","Type":"ContainerStarted","Data":"3bc7dc70245b5e52e7da3ab17ec0edd87808d2d0f8ec7c4453649756c2249bf7"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.539180 4717 generic.go:334] "Generic (PLEG): container finished" podID="8c7a43c4-a720-4699-8343-3daeebaef9c2" containerID="e22f5d800ef06f1bd70e1547a032967dea9f2c3bfeaa00b71c92e049dd7e4a03" exitCode=0 Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.539304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" event={"ID":"8c7a43c4-a720-4699-8343-3daeebaef9c2","Type":"ContainerDied","Data":"e22f5d800ef06f1bd70e1547a032967dea9f2c3bfeaa00b71c92e049dd7e4a03"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.544638 4717 generic.go:334] "Generic (PLEG): container finished" podID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerID="c9619941bb0c85b5f2fee87d908473f97cf889cc89d4681be856c98b5c12286f" exitCode=0 Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.544689 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-fntpl" event={"ID":"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee","Type":"ContainerDied","Data":"c9619941bb0c85b5f2fee87d908473f97cf889cc89d4681be856c98b5c12286f"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.544717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-fntpl" event={"ID":"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee","Type":"ContainerStarted","Data":"fe86ab0a8134eee760214faef0e5a79af497c6f7cf880f8575867627b5fbd4a9"} Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.562270 4717 scope.go:117] "RemoveContainer" containerID="f8da64cd006dd4cae567ac55a59baa5f3b409b95c88ae90d323ae06c1b0b49ca" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.604129 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-lh5mj" podStartSLOduration=8.604091646 podStartE2EDuration="8.604091646s" podCreationTimestamp="2025-10-07 14:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:10:50.568335875 +0000 UTC m=+1032.396261677" watchObservedRunningTime="2025-10-07 14:10:50.604091646 +0000 UTC m=+1032.432017438" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.605292 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 07 14:10:50 crc kubenswrapper[4717]: E1007 14:10:50.605662 4717 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.605679 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" Oct 07 14:10:50 crc kubenswrapper[4717]: E1007 14:10:50.605696 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="init" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.605706 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="init" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.605879 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" containerName="dnsmasq-dns" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.688952 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.699569 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.699982 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.700211 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9h96n" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.701350 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.753377 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.753425 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjvwk\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-kube-api-access-sjvwk\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.753455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.753484 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-lock\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.753571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-cache\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " 
pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.754502 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.782982 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.791115 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dl9s5"] Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.854448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-cache\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.854508 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.854527 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjvwk\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-kube-api-access-sjvwk\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.854550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.854573 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-lock\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.855125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-lock\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.855331 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4d8f5328-2247-4e10-8d15-9902887bd75f-cache\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.855577 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: E1007 14:10:50.857059 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 14:10:50 crc kubenswrapper[4717]: E1007 
14:10:50.857088 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 14:10:50 crc kubenswrapper[4717]: E1007 14:10:50.857139 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift podName:4d8f5328-2247-4e10-8d15-9902887bd75f nodeName:}" failed. No retries permitted until 2025-10-07 14:10:51.357119618 +0000 UTC m=+1033.185045410 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift") pod "swift-storage-0" (UID: "4d8f5328-2247-4e10-8d15-9902887bd75f") : configmap "swift-ring-files" not found Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.877224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjvwk\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-kube-api-access-sjvwk\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.877436 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.885188 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf5f7d6-f070-4265-aa30-c53c81a623db" path="/var/lib/kubelet/pods/5bf5f7d6-f070-4265-aa30-c53c81a623db/volumes" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.939706 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:50 crc kubenswrapper[4717]: I1007 14:10:50.984337 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.060321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb\") pod \"4179936c-69ae-4c85-bebd-50ce6531c81f\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.060383 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc\") pod \"4179936c-69ae-4c85-bebd-50ce6531c81f\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.060453 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbgcj\" (UniqueName: \"kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj\") pod \"4179936c-69ae-4c85-bebd-50ce6531c81f\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.060488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config\") pod \"4179936c-69ae-4c85-bebd-50ce6531c81f\" (UID: \"4179936c-69ae-4c85-bebd-50ce6531c81f\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.063760 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj" (OuterVolumeSpecName: "kube-api-access-cbgcj") pod "4179936c-69ae-4c85-bebd-50ce6531c81f" (UID: "4179936c-69ae-4c85-bebd-50ce6531c81f"). InnerVolumeSpecName "kube-api-access-cbgcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.079592 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config" (OuterVolumeSpecName: "config") pod "4179936c-69ae-4c85-bebd-50ce6531c81f" (UID: "4179936c-69ae-4c85-bebd-50ce6531c81f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.080712 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4179936c-69ae-4c85-bebd-50ce6531c81f" (UID: "4179936c-69ae-4c85-bebd-50ce6531c81f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.085819 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4179936c-69ae-4c85-bebd-50ce6531c81f" (UID: "4179936c-69ae-4c85-bebd-50ce6531c81f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.161753 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb\") pod \"8c7a43c4-a720-4699-8343-3daeebaef9c2\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.161840 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb\") pod \"8c7a43c4-a720-4699-8343-3daeebaef9c2\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.161944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srfjr\" (UniqueName: \"kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr\") pod \"8c7a43c4-a720-4699-8343-3daeebaef9c2\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.161973 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc\") pod \"8c7a43c4-a720-4699-8343-3daeebaef9c2\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.162025 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config\") pod \"8c7a43c4-a720-4699-8343-3daeebaef9c2\" (UID: \"8c7a43c4-a720-4699-8343-3daeebaef9c2\") " Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.162355 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.162373 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.162385 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4179936c-69ae-4c85-bebd-50ce6531c81f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.162394 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbgcj\" (UniqueName: \"kubernetes.io/projected/4179936c-69ae-4c85-bebd-50ce6531c81f-kube-api-access-cbgcj\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.165119 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr" (OuterVolumeSpecName: "kube-api-access-srfjr") pod "8c7a43c4-a720-4699-8343-3daeebaef9c2" (UID: "8c7a43c4-a720-4699-8343-3daeebaef9c2"). InnerVolumeSpecName "kube-api-access-srfjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.180910 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c7a43c4-a720-4699-8343-3daeebaef9c2" (UID: "8c7a43c4-a720-4699-8343-3daeebaef9c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.188875 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c7a43c4-a720-4699-8343-3daeebaef9c2" (UID: "8c7a43c4-a720-4699-8343-3daeebaef9c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.191873 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config" (OuterVolumeSpecName: "config") pod "8c7a43c4-a720-4699-8343-3daeebaef9c2" (UID: "8c7a43c4-a720-4699-8343-3daeebaef9c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.194109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c7a43c4-a720-4699-8343-3daeebaef9c2" (UID: "8c7a43c4-a720-4699-8343-3daeebaef9c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.264157 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.264189 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.264201 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srfjr\" (UniqueName: \"kubernetes.io/projected/8c7a43c4-a720-4699-8343-3daeebaef9c2-kube-api-access-srfjr\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.264212 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.264221 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c7a43c4-a720-4699-8343-3daeebaef9c2-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.365509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:51 crc kubenswrapper[4717]: E1007 14:10:51.365692 4717 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 14:10:51 crc kubenswrapper[4717]: E1007 14:10:51.365960 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 14:10:51 crc kubenswrapper[4717]: E1007 14:10:51.366030 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift podName:4d8f5328-2247-4e10-8d15-9902887bd75f nodeName:}" failed. No retries permitted until 2025-10-07 14:10:52.365995241 +0000 UTC m=+1034.193921033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift") pod "swift-storage-0" (UID: "4d8f5328-2247-4e10-8d15-9902887bd75f") : configmap "swift-ring-files" not found Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.554778 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-fntpl" event={"ID":"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee","Type":"ContainerStarted","Data":"9a107a91c7fda81a56b6cf1c9d245cdd07be8bb2bd52886e53b767ec56762d48"} Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.554856 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.558203 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" event={"ID":"4179936c-69ae-4c85-bebd-50ce6531c81f","Type":"ContainerDied","Data":"66d2e8c0eecf72284c13902f5e325efda38e11ef9f695c12d8d333c4a799bfb1"} Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.558301 4717 scope.go:117] "RemoveContainer" containerID="9ec345cf9d52ad77587f4b2f420986a80c03ea0b38d0ea78ef58c203d2ef6929" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.558303 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-hgk6p" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.560702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57dca108-f9e5-443a-aa97-01263cf96863","Type":"ContainerStarted","Data":"2a7f450136e98d056702d116e97ccd93fd4328dbf214675d03c712ff9b4d788f"} Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.560737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"57dca108-f9e5-443a-aa97-01263cf96863","Type":"ContainerStarted","Data":"8897489c235982b243d293cc1692d61696acf46c2b617a1b6c44989de290a77d"} Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.561276 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.562724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" event={"ID":"8c7a43c4-a720-4699-8343-3daeebaef9c2","Type":"ContainerDied","Data":"0ecaa712a02585d2b724d3f96591e777d34aa77f0d06a8f4ed3bcdc11befe564"} Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.562784 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-h8d4n" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.589227 4717 scope.go:117] "RemoveContainer" containerID="e22f5d800ef06f1bd70e1547a032967dea9f2c3bfeaa00b71c92e049dd7e4a03" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.598471 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-fntpl" podStartSLOduration=2.598449493 podStartE2EDuration="2.598449493s" podCreationTimestamp="2025-10-07 14:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:10:51.577692668 +0000 UTC m=+1033.405618470" watchObservedRunningTime="2025-10-07 14:10:51.598449493 +0000 UTC m=+1033.426375295" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.621917 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.624631155 podStartE2EDuration="7.621892413s" podCreationTimestamp="2025-10-07 14:10:44 +0000 UTC" firstStartedPulling="2025-10-07 14:10:45.441333302 +0000 UTC m=+1027.269259094" lastFinishedPulling="2025-10-07 14:10:50.43859454 +0000 UTC m=+1032.266520352" observedRunningTime="2025-10-07 14:10:51.614411175 +0000 UTC m=+1033.442336987" watchObservedRunningTime="2025-10-07 14:10:51.621892413 +0000 UTC m=+1033.449818225" Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.657752 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.662759 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-hgk6p"] Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.689785 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:51 crc kubenswrapper[4717]: I1007 14:10:51.690041 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-h8d4n"] Oct 07 14:10:52 crc kubenswrapper[4717]: I1007 14:10:52.385173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:52 crc kubenswrapper[4717]: E1007 14:10:52.385427 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 14:10:52 crc kubenswrapper[4717]: E1007 14:10:52.385912 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 14:10:52 crc kubenswrapper[4717]: E1007 14:10:52.386029 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift podName:4d8f5328-2247-4e10-8d15-9902887bd75f nodeName:}" failed. No retries permitted until 2025-10-07 14:10:54.385978158 +0000 UTC m=+1036.213903950 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift") pod "swift-storage-0" (UID: "4d8f5328-2247-4e10-8d15-9902887bd75f") : configmap "swift-ring-files" not found Oct 07 14:10:52 crc kubenswrapper[4717]: I1007 14:10:52.840430 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 07 14:10:52 crc kubenswrapper[4717]: I1007 14:10:52.881704 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4179936c-69ae-4c85-bebd-50ce6531c81f" path="/var/lib/kubelet/pods/4179936c-69ae-4c85-bebd-50ce6531c81f/volumes" Oct 07 14:10:52 crc kubenswrapper[4717]: I1007 14:10:52.882553 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c7a43c4-a720-4699-8343-3daeebaef9c2" path="/var/lib/kubelet/pods/8c7a43c4-a720-4699-8343-3daeebaef9c2/volumes" Oct 07 14:10:52 crc kubenswrapper[4717]: I1007 14:10:52.894910 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 07 14:10:53 crc kubenswrapper[4717]: I1007 14:10:53.152622 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:53 crc kubenswrapper[4717]: I1007 14:10:53.194524 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="e3c49618-d6f6-4379-8ac1-b474b0ffdeea" containerName="galera" probeResult="failure" output=< Oct 07 14:10:53 crc kubenswrapper[4717]: wsrep_local_state_comment (Joined) differs from Synced Oct 07 14:10:53 crc kubenswrapper[4717]: > Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.420228 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:54 crc kubenswrapper[4717]: E1007 14:10:54.420448 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 14:10:54 crc kubenswrapper[4717]: E1007 14:10:54.420585 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 14:10:54 crc kubenswrapper[4717]: E1007 14:10:54.420643 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift podName:4d8f5328-2247-4e10-8d15-9902887bd75f nodeName:}" failed. No retries permitted until 2025-10-07 14:10:58.420627223 +0000 UTC m=+1040.248553015 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift") pod "swift-storage-0" (UID: "4d8f5328-2247-4e10-8d15-9902887bd75f") : configmap "swift-ring-files" not found Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.536332 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jgbc8"] Oct 07 14:10:54 crc kubenswrapper[4717]: E1007 14:10:54.536718 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c7a43c4-a720-4699-8343-3daeebaef9c2" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.536743 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c7a43c4-a720-4699-8343-3daeebaef9c2" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: E1007 14:10:54.536760 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4179936c-69ae-4c85-bebd-50ce6531c81f" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.536768 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4179936c-69ae-4c85-bebd-50ce6531c81f" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.536959 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4179936c-69ae-4c85-bebd-50ce6531c81f" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.536979 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c7a43c4-a720-4699-8343-3daeebaef9c2" containerName="init" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.537655 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.541251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.543821 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.543956 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.551560 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jgbc8"] Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.622999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623071 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvd7p\" (UniqueName: \"kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623117 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: 
\"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.623332 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724709 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724789 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724904 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " 
pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.724960 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.725031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvd7p\" (UniqueName: \"kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.725646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.725920 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.726179 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.730312 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.731393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.734265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.743085 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvd7p\" (UniqueName: \"kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p\") pod \"swift-ring-rebalance-jgbc8\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:54 crc kubenswrapper[4717]: I1007 14:10:54.853574 4717 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:10:55 crc kubenswrapper[4717]: I1007 14:10:55.272234 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jgbc8"] Oct 07 14:10:55 crc kubenswrapper[4717]: I1007 14:10:55.599752 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jgbc8" event={"ID":"bf613982-6142-4491-931b-ad2e2b2b637f","Type":"ContainerStarted","Data":"c926c2309d87f66de82bfd2e38dd0fb9636bd5662d4ce9b953cfa0700d504035"} Oct 07 14:10:56 crc kubenswrapper[4717]: I1007 14:10:56.954424 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-5flzp"] Oct 07 14:10:56 crc kubenswrapper[4717]: I1007 14:10:56.955702 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5flzp" Oct 07 14:10:56 crc kubenswrapper[4717]: I1007 14:10:56.967853 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5flzp"] Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.066468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtdn\" (UniqueName: \"kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn\") pod \"keystone-db-create-5flzp\" (UID: \"c364d515-73f8-4e3f-9fde-a6f583e3a807\") " pod="openstack/keystone-db-create-5flzp" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.124277 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4qpnj"] Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.125290 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4qpnj" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.137307 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4qpnj"] Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.150848 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.168705 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtdn\" (UniqueName: \"kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn\") pod \"keystone-db-create-5flzp\" (UID: \"c364d515-73f8-4e3f-9fde-a6f583e3a807\") " pod="openstack/keystone-db-create-5flzp" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.188548 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtdn\" (UniqueName: \"kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn\") pod \"keystone-db-create-5flzp\" (UID: \"c364d515-73f8-4e3f-9fde-a6f583e3a807\") " pod="openstack/keystone-db-create-5flzp" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.269831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qtf8\" (UniqueName: \"kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8\") pod \"placement-db-create-4qpnj\" (UID: \"69e50833-7e85-44b4-926f-e088d5a065fc\") " pod="openstack/placement-db-create-4qpnj" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.274206 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-5flzp" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.371669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qtf8\" (UniqueName: \"kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8\") pod \"placement-db-create-4qpnj\" (UID: \"69e50833-7e85-44b4-926f-e088d5a065fc\") " pod="openstack/placement-db-create-4qpnj" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.390913 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qtf8\" (UniqueName: \"kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8\") pod \"placement-db-create-4qpnj\" (UID: \"69e50833-7e85-44b4-926f-e088d5a065fc\") " pod="openstack/placement-db-create-4qpnj" Oct 07 14:10:57 crc kubenswrapper[4717]: I1007 14:10:57.451960 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4qpnj" Oct 07 14:10:58 crc kubenswrapper[4717]: I1007 14:10:58.494235 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:10:58 crc kubenswrapper[4717]: E1007 14:10:58.494407 4717 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 07 14:10:58 crc kubenswrapper[4717]: E1007 14:10:58.494623 4717 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 07 14:10:58 crc kubenswrapper[4717]: E1007 14:10:58.494669 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift podName:4d8f5328-2247-4e10-8d15-9902887bd75f nodeName:}" failed. No retries permitted until 2025-10-07 14:11:06.494654496 +0000 UTC m=+1048.322580288 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift") pod "swift-storage-0" (UID: "4d8f5328-2247-4e10-8d15-9902887bd75f") : configmap "swift-ring-files" not found Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.118683 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4qpnj"] Oct 07 14:10:59 crc kubenswrapper[4717]: W1007 14:10:59.123377 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e50833_7e85_44b4_926f_e088d5a065fc.slice/crio-4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c WatchSource:0}: Error finding container 4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c: Status 404 returned error can't find the container with id 4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.181531 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-5flzp"] Oct 07 14:10:59 crc kubenswrapper[4717]: W1007 14:10:59.198967 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc364d515_73f8_4e3f_9fde_a6f583e3a807.slice/crio-e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2 WatchSource:0}: Error finding container e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2: Status 404 returned error can't find the container with id e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2 Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.637146 4717 generic.go:334] "Generic (PLEG): container finished" podID="c364d515-73f8-4e3f-9fde-a6f583e3a807" containerID="86d6c2d66b82940e46906fda1852fd5202cdbe872128d88161e1ed60eb1f3bb6" exitCode=0 Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.637354 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5flzp" event={"ID":"c364d515-73f8-4e3f-9fde-a6f583e3a807","Type":"ContainerDied","Data":"86d6c2d66b82940e46906fda1852fd5202cdbe872128d88161e1ed60eb1f3bb6"} Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.637450 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5flzp" event={"ID":"c364d515-73f8-4e3f-9fde-a6f583e3a807","Type":"ContainerStarted","Data":"e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2"} Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.641481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jgbc8" event={"ID":"bf613982-6142-4491-931b-ad2e2b2b637f","Type":"ContainerStarted","Data":"e6133cc95f5985172240a6c6f07709c8ccc999ba70a98a1bef8b46fa968f307d"} Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.642771 4717 generic.go:334] "Generic (PLEG): container finished" podID="69e50833-7e85-44b4-926f-e088d5a065fc" containerID="37e0eb90e1710d6d1246629f4fe05af36cb0ae73d218124feac13fd54755bd53" exitCode=0 Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.642794 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4qpnj" event={"ID":"69e50833-7e85-44b4-926f-e088d5a065fc","Type":"ContainerDied","Data":"37e0eb90e1710d6d1246629f4fe05af36cb0ae73d218124feac13fd54755bd53"} Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.642807 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-4qpnj" event={"ID":"69e50833-7e85-44b4-926f-e088d5a065fc","Type":"ContainerStarted","Data":"4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c"} Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.695780 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jgbc8" podStartSLOduration=2.239610962 podStartE2EDuration="5.69576052s" podCreationTimestamp="2025-10-07 14:10:54 +0000 UTC" firstStartedPulling="2025-10-07 14:10:55.274675471 +0000 UTC m=+1037.102601273" lastFinishedPulling="2025-10-07 14:10:58.730825029 +0000 UTC m=+1040.558750831" observedRunningTime="2025-10-07 14:10:59.688476258 +0000 UTC m=+1041.516402050" watchObservedRunningTime="2025-10-07 14:10:59.69576052 +0000 UTC m=+1041.523686312" Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.785227 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.844484 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:10:59 crc kubenswrapper[4717]: I1007 14:10:59.844759 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="dnsmasq-dns" containerID="cri-o://eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4" gracePeriod=10 Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.082807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.310199 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.429277 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc\") pod \"35731f7d-c41a-4572-99b7-67459d507e0d\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.429440 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg448\" (UniqueName: \"kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448\") pod \"35731f7d-c41a-4572-99b7-67459d507e0d\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.429480 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config\") pod \"35731f7d-c41a-4572-99b7-67459d507e0d\" (UID: \"35731f7d-c41a-4572-99b7-67459d507e0d\") " Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.440478 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448" (OuterVolumeSpecName: "kube-api-access-sg448") pod "35731f7d-c41a-4572-99b7-67459d507e0d" (UID: "35731f7d-c41a-4572-99b7-67459d507e0d"). InnerVolumeSpecName "kube-api-access-sg448". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.468622 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "35731f7d-c41a-4572-99b7-67459d507e0d" (UID: "35731f7d-c41a-4572-99b7-67459d507e0d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.485790 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config" (OuterVolumeSpecName: "config") pod "35731f7d-c41a-4572-99b7-67459d507e0d" (UID: "35731f7d-c41a-4572-99b7-67459d507e0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.531776 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg448\" (UniqueName: \"kubernetes.io/projected/35731f7d-c41a-4572-99b7-67459d507e0d-kube-api-access-sg448\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.531819 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.531831 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/35731f7d-c41a-4572-99b7-67459d507e0d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.651193 4717 generic.go:334] "Generic (PLEG): container finished" podID="35731f7d-c41a-4572-99b7-67459d507e0d" containerID="eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4" exitCode=0 Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.651273 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.651304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" event={"ID":"35731f7d-c41a-4572-99b7-67459d507e0d","Type":"ContainerDied","Data":"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4"} Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.652670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-r9qng" event={"ID":"35731f7d-c41a-4572-99b7-67459d507e0d","Type":"ContainerDied","Data":"94d754d111d9b3896f9ad967a68ed28a401557c0f8ad6b5c793f8eb15451154f"} Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.652698 4717 scope.go:117] "RemoveContainer" containerID="eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.675552 4717 scope.go:117] "RemoveContainer" containerID="1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.704462 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.717772 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-r9qng"] Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.723570 4717 scope.go:117] "RemoveContainer" containerID="eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4" Oct 07 14:11:00 crc kubenswrapper[4717]: E1007 14:11:00.724674 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4\": container with ID starting with eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4 not found: ID does not exist" containerID="eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.724737 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4"} err="failed to get container status \"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4\": rpc error: code = NotFound desc = could not find container \"eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4\": container with ID starting with eed88b2b65ffb4363010831713ff81cdd4b0a305c374f61881579eca1e67ace4 not found: ID does not exist" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.724771 4717 scope.go:117] "RemoveContainer" containerID="1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2" Oct 07 14:11:00 crc kubenswrapper[4717]: E1007 14:11:00.725251 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2\": container with ID starting with 1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2 not found: ID does not exist" containerID="1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.725284 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2"} err="failed to get container status 
\"1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2\": rpc error: code = NotFound desc = could not find container \"1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2\": container with ID starting with 1e8465dcfccec6d11c00d517d6218c338438d9c4c0706d53236545d5c444f6a2 not found: ID does not exist" Oct 07 14:11:00 crc kubenswrapper[4717]: I1007 14:11:00.882134 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" path="/var/lib/kubelet/pods/35731f7d-c41a-4572-99b7-67459d507e0d/volumes" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.029784 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4qpnj" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.036937 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5flzp" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.143037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qtf8\" (UniqueName: \"kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8\") pod \"69e50833-7e85-44b4-926f-e088d5a065fc\" (UID: \"69e50833-7e85-44b4-926f-e088d5a065fc\") " Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.145405 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rtdn\" (UniqueName: \"kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn\") pod \"c364d515-73f8-4e3f-9fde-a6f583e3a807\" (UID: \"c364d515-73f8-4e3f-9fde-a6f583e3a807\") " Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.151861 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8" (OuterVolumeSpecName: "kube-api-access-5qtf8") pod "69e50833-7e85-44b4-926f-e088d5a065fc" (UID: "69e50833-7e85-44b4-926f-e088d5a065fc"). InnerVolumeSpecName "kube-api-access-5qtf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.151974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn" (OuterVolumeSpecName: "kube-api-access-4rtdn") pod "c364d515-73f8-4e3f-9fde-a6f583e3a807" (UID: "c364d515-73f8-4e3f-9fde-a6f583e3a807"). InnerVolumeSpecName "kube-api-access-4rtdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.247820 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rtdn\" (UniqueName: \"kubernetes.io/projected/c364d515-73f8-4e3f-9fde-a6f583e3a807-kube-api-access-4rtdn\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.247860 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qtf8\" (UniqueName: \"kubernetes.io/projected/69e50833-7e85-44b4-926f-e088d5a065fc-kube-api-access-5qtf8\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.660141 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4qpnj" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.660146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4qpnj" event={"ID":"69e50833-7e85-44b4-926f-e088d5a065fc","Type":"ContainerDied","Data":"4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c"} Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.660273 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ca7ca35b5a29dfd42e74e833171dc568bffb1ef5a7e32aa3a51cf56f0f7ea8c" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.663170 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-5flzp" event={"ID":"c364d515-73f8-4e3f-9fde-a6f583e3a807","Type":"ContainerDied","Data":"e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2"} Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.663216 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3ff9d0f62617e3f424bfc26c6a0f71d7c2900c65ad82df494f9c9ab539719f2" Oct 07 14:11:01 crc kubenswrapper[4717]: I1007 14:11:01.663237 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-5flzp" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.395363 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-kdghz"] Oct 07 14:11:02 crc kubenswrapper[4717]: E1007 14:11:02.396133 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="init" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396151 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="init" Oct 07 14:11:02 crc kubenswrapper[4717]: E1007 14:11:02.396179 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="dnsmasq-dns" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396187 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="dnsmasq-dns" Oct 07 14:11:02 crc kubenswrapper[4717]: E1007 14:11:02.396197 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e50833-7e85-44b4-926f-e088d5a065fc" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396207 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e50833-7e85-44b4-926f-e088d5a065fc" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: E1007 14:11:02.396227 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c364d515-73f8-4e3f-9fde-a6f583e3a807" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396234 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364d515-73f8-4e3f-9fde-a6f583e3a807" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396434 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e50833-7e85-44b4-926f-e088d5a065fc" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396447 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c364d515-73f8-4e3f-9fde-a6f583e3a807" containerName="mariadb-database-create" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.396467 4717 
memory_manager.go:354] "RemoveStaleState removing state" podUID="35731f7d-c41a-4572-99b7-67459d507e0d" containerName="dnsmasq-dns" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.397092 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kdghz" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.402672 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kdghz"] Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.472161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhfr2\" (UniqueName: \"kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2\") pod \"glance-db-create-kdghz\" (UID: \"5961e0b0-7403-4c58-ae35-1609815f5c4b\") " pod="openstack/glance-db-create-kdghz" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.573444 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhfr2\" (UniqueName: \"kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2\") pod \"glance-db-create-kdghz\" (UID: \"5961e0b0-7403-4c58-ae35-1609815f5c4b\") " pod="openstack/glance-db-create-kdghz" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.592511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhfr2\" (UniqueName: \"kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2\") pod \"glance-db-create-kdghz\" (UID: \"5961e0b0-7403-4c58-ae35-1609815f5c4b\") " pod="openstack/glance-db-create-kdghz" Oct 07 14:11:02 crc kubenswrapper[4717]: I1007 14:11:02.718237 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kdghz" Oct 07 14:11:03 crc kubenswrapper[4717]: I1007 14:11:03.162869 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-kdghz"] Oct 07 14:11:03 crc kubenswrapper[4717]: I1007 14:11:03.678357 4717 generic.go:334] "Generic (PLEG): container finished" podID="5961e0b0-7403-4c58-ae35-1609815f5c4b" containerID="62c02fefd3080b733dbf933a450921789724e4301348ad6d7bc0d29c4e853849" exitCode=0 Oct 07 14:11:03 crc kubenswrapper[4717]: I1007 14:11:03.678413 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kdghz" event={"ID":"5961e0b0-7403-4c58-ae35-1609815f5c4b","Type":"ContainerDied","Data":"62c02fefd3080b733dbf933a450921789724e4301348ad6d7bc0d29c4e853849"} Oct 07 14:11:03 crc kubenswrapper[4717]: I1007 14:11:03.678448 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kdghz" event={"ID":"5961e0b0-7403-4c58-ae35-1609815f5c4b","Type":"ContainerStarted","Data":"01d78d4d2a45ebf50c7a67f8f2d30529bdcd4d054e3525ba87fab89f89ba3ab7"} Oct 07 14:11:04 crc kubenswrapper[4717]: I1007 14:11:04.961606 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-kdghz" Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.121924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhfr2\" (UniqueName: \"kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2\") pod \"5961e0b0-7403-4c58-ae35-1609815f5c4b\" (UID: \"5961e0b0-7403-4c58-ae35-1609815f5c4b\") " Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.139278 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2" (OuterVolumeSpecName: "kube-api-access-lhfr2") pod "5961e0b0-7403-4c58-ae35-1609815f5c4b" (UID: "5961e0b0-7403-4c58-ae35-1609815f5c4b"). InnerVolumeSpecName "kube-api-access-lhfr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.224187 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhfr2\" (UniqueName: \"kubernetes.io/projected/5961e0b0-7403-4c58-ae35-1609815f5c4b-kube-api-access-lhfr2\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.694918 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-kdghz" Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.695109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-kdghz" event={"ID":"5961e0b0-7403-4c58-ae35-1609815f5c4b","Type":"ContainerDied","Data":"01d78d4d2a45ebf50c7a67f8f2d30529bdcd4d054e3525ba87fab89f89ba3ab7"} Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.695349 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d78d4d2a45ebf50c7a67f8f2d30529bdcd4d054e3525ba87fab89f89ba3ab7" Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.696236 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf613982-6142-4491-931b-ad2e2b2b637f" containerID="e6133cc95f5985172240a6c6f07709c8ccc999ba70a98a1bef8b46fa968f307d" exitCode=0 Oct 07 14:11:05 crc kubenswrapper[4717]: I1007 14:11:05.696276 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jgbc8" event={"ID":"bf613982-6142-4491-931b-ad2e2b2b637f","Type":"ContainerDied","Data":"e6133cc95f5985172240a6c6f07709c8ccc999ba70a98a1bef8b46fa968f307d"} Oct 07 14:11:06 crc kubenswrapper[4717]: I1007 14:11:06.543640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:11:06 crc kubenswrapper[4717]: I1007 14:11:06.554586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4d8f5328-2247-4e10-8d15-9902887bd75f-etc-swift\") pod \"swift-storage-0\" (UID: \"4d8f5328-2247-4e10-8d15-9902887bd75f\") " pod="openstack/swift-storage-0" Oct 07 14:11:06 crc kubenswrapper[4717]: I1007 14:11:06.794103 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.052765 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152590 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152643 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152735 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvd7p\" (UniqueName: \"kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152831 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152866 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.152883 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift\") pod \"bf613982-6142-4491-931b-ad2e2b2b637f\" (UID: \"bf613982-6142-4491-931b-ad2e2b2b637f\") " Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.154486 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.154673 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.155424 4717 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.155451 4717 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf613982-6142-4491-931b-ad2e2b2b637f-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.158978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p" (OuterVolumeSpecName: "kube-api-access-rvd7p") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "kube-api-access-rvd7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.161437 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.175537 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts" (OuterVolumeSpecName: "scripts") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.176147 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.176967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bf613982-6142-4491-931b-ad2e2b2b637f" (UID: "bf613982-6142-4491-931b-ad2e2b2b637f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.259934 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvd7p\" (UniqueName: \"kubernetes.io/projected/bf613982-6142-4491-931b-ad2e2b2b637f-kube-api-access-rvd7p\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.260232 4717 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.260245 4717 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.260256 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf613982-6142-4491-931b-ad2e2b2b637f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.260264 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf613982-6142-4491-931b-ad2e2b2b637f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.278843 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d269-account-create-9kgwm"] Oct 07 14:11:07 crc kubenswrapper[4717]: E1007 14:11:07.279251 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf613982-6142-4491-931b-ad2e2b2b637f" containerName="swift-ring-rebalance" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.279274 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf613982-6142-4491-931b-ad2e2b2b637f" containerName="swift-ring-rebalance" Oct 07 14:11:07 crc kubenswrapper[4717]: E1007 14:11:07.279303 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5961e0b0-7403-4c58-ae35-1609815f5c4b" containerName="mariadb-database-create" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.279311 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="5961e0b0-7403-4c58-ae35-1609815f5c4b" containerName="mariadb-database-create" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.279500 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="5961e0b0-7403-4c58-ae35-1609815f5c4b" containerName="mariadb-database-create" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.279525 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf613982-6142-4491-931b-ad2e2b2b637f" containerName="swift-ring-rebalance" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.280285 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.282948 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.288085 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d269-account-create-9kgwm"] Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.351325 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 07 14:11:07 crc kubenswrapper[4717]: W1007 14:11:07.355998 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d8f5328_2247_4e10_8d15_9902887bd75f.slice/crio-71f614d3ea6bb7b8b9d3e603e0491b8a3c15900bad7d5bb5c891c624fb17c112 WatchSource:0}: Error finding container 71f614d3ea6bb7b8b9d3e603e0491b8a3c15900bad7d5bb5c891c624fb17c112: Status 404 returned error can't find the container with id 71f614d3ea6bb7b8b9d3e603e0491b8a3c15900bad7d5bb5c891c624fb17c112 Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.361603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b995b\" (UniqueName: \"kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b\") pod \"placement-d269-account-create-9kgwm\" (UID: \"eed7e8d9-d828-4710-be5c-91e84e17a057\") " pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.462924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b995b\" (UniqueName: \"kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b\") pod \"placement-d269-account-create-9kgwm\" (UID: \"eed7e8d9-d828-4710-be5c-91e84e17a057\") " pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.479742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b995b\" (UniqueName: \"kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b\") pod \"placement-d269-account-create-9kgwm\" (UID: \"eed7e8d9-d828-4710-be5c-91e84e17a057\") " pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.536296 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vr6v8" podUID="f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c" containerName="ovn-controller" probeResult="failure" output=< Oct 07 14:11:07 crc kubenswrapper[4717]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 14:11:07 crc kubenswrapper[4717]: > Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.598043 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.714301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jgbc8" event={"ID":"bf613982-6142-4491-931b-ad2e2b2b637f","Type":"ContainerDied","Data":"c926c2309d87f66de82bfd2e38dd0fb9636bd5662d4ce9b953cfa0700d504035"} Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.714590 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c926c2309d87f66de82bfd2e38dd0fb9636bd5662d4ce9b953cfa0700d504035" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.714348 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jgbc8" Oct 07 14:11:07 crc kubenswrapper[4717]: I1007 14:11:07.715148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"71f614d3ea6bb7b8b9d3e603e0491b8a3c15900bad7d5bb5c891c624fb17c112"} Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.014376 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d269-account-create-9kgwm"] Oct 07 14:11:08 crc kubenswrapper[4717]: W1007 14:11:08.016960 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed7e8d9_d828_4710_be5c_91e84e17a057.slice/crio-3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb WatchSource:0}: Error finding container 3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb: Status 404 returned error can't find the container with id 3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.725504 4717 generic.go:334] "Generic (PLEG): container finished" podID="eed7e8d9-d828-4710-be5c-91e84e17a057" containerID="e286a66694dbfb7cc28174d045b746e8d7dcc189935265bf4e2d671d25aac995" exitCode=0 Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.725690 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d269-account-create-9kgwm" event={"ID":"eed7e8d9-d828-4710-be5c-91e84e17a057","Type":"ContainerDied","Data":"e286a66694dbfb7cc28174d045b746e8d7dcc189935265bf4e2d671d25aac995"} Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.726231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d269-account-create-9kgwm" event={"ID":"eed7e8d9-d828-4710-be5c-91e84e17a057","Type":"ContainerStarted","Data":"3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb"} Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.731939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"a25ea0330987b07592d0d48e1328356d692fae288999876d32a0fb14cd83c280"} Oct 07 14:11:08 crc kubenswrapper[4717]: I1007 14:11:08.731980 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"9a4cccf8b8223b10a9e06d615196788ae7b52061882a232885026f766ffe365f"} Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.740783 4717 generic.go:334] "Generic (PLEG): container finished" podID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerID="08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b" 
exitCode=0 Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.740859 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerDied","Data":"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b"} Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.748945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"88e51dc42764d3638a1b0ea103ad884422498d113477270fe2404d66d7417bad"} Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.749082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"389c83df1b8b484909328d3553215ba991ce8f4db25e2e8dc2c5d01765eb2de8"} Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.753537 4717 generic.go:334] "Generic (PLEG): container finished" podID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerID="610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3" exitCode=0 Oct 07 14:11:09 crc kubenswrapper[4717]: I1007 14:11:09.753750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerDied","Data":"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.433533 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.516728 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b995b\" (UniqueName: \"kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b\") pod \"eed7e8d9-d828-4710-be5c-91e84e17a057\" (UID: \"eed7e8d9-d828-4710-be5c-91e84e17a057\") " Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.523125 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b" (OuterVolumeSpecName: "kube-api-access-b995b") pod "eed7e8d9-d828-4710-be5c-91e84e17a057" (UID: "eed7e8d9-d828-4710-be5c-91e84e17a057"). InnerVolumeSpecName "kube-api-access-b995b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.618182 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b995b\" (UniqueName: \"kubernetes.io/projected/eed7e8d9-d828-4710-be5c-91e84e17a057-kube-api-access-b995b\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.766843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerStarted","Data":"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.768653 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.769384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerStarted","Data":"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.769610 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.771573 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d269-account-create-9kgwm" event={"ID":"eed7e8d9-d828-4710-be5c-91e84e17a057","Type":"ContainerDied","Data":"3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.771607 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b928ced2a958acd061e79b57483452344e6ae381f6f837b1741002f3b6d75fb" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.771658 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d269-account-create-9kgwm" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.788336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"2604e26fc2dd9253f53206ac800b8701488f55c8e5687d5333367b084c7824e8"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.788391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"afec398246c23b099b8c7effff5020fe8812141357e78f93713f0434c9845ef9"} Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.801835 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.353978565 podStartE2EDuration="58.80175295s" podCreationTimestamp="2025-10-07 14:10:12 +0000 UTC" firstStartedPulling="2025-10-07 14:10:27.316339088 +0000 UTC m=+1009.144264880" lastFinishedPulling="2025-10-07 14:10:33.764113463 +0000 UTC m=+1015.592039265" observedRunningTime="2025-10-07 14:11:10.797115431 +0000 UTC m=+1052.625041243" watchObservedRunningTime="2025-10-07 14:11:10.80175295 +0000 UTC m=+1052.629678742" Oct 07 14:11:10 crc kubenswrapper[4717]: I1007 14:11:10.826533 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=51.029219321 podStartE2EDuration="58.826516016s" podCreationTimestamp="2025-10-07 14:10:12 +0000 UTC" firstStartedPulling="2025-10-07 14:10:25.966784847 +0000 UTC m=+1007.794710629" lastFinishedPulling="2025-10-07 14:10:33.764081532 +0000 UTC m=+1015.592007324" observedRunningTime="2025-10-07 14:11:10.822429382 +0000 UTC m=+1052.650355184" watchObservedRunningTime="2025-10-07 14:11:10.826516016 +0000 UTC m=+1052.654441808" Oct 07 14:11:11 crc kubenswrapper[4717]: I1007 14:11:11.799186 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"2a7bbbfbc7bdccdd1e049e5ba38e10bd5bc880d94a0827711b24b56fd03bfa39"} Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.502129 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bf89-account-create-h42p7"] Oct 07 14:11:12 crc kubenswrapper[4717]: E1007 14:11:12.502574 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed7e8d9-d828-4710-be5c-91e84e17a057" containerName="mariadb-account-create" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.502605 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed7e8d9-d828-4710-be5c-91e84e17a057" containerName="mariadb-account-create" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.502815 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed7e8d9-d828-4710-be5c-91e84e17a057" containerName="mariadb-account-create" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.503546 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.512565 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.519663 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf89-account-create-h42p7"] Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.540537 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vr6v8" podUID="f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c" containerName="ovn-controller" probeResult="failure" output=< Oct 07 14:11:12 crc kubenswrapper[4717]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 07 14:11:12 crc kubenswrapper[4717]: > Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.559655 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv9vg\" (UniqueName: \"kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg\") pod \"glance-bf89-account-create-h42p7\" (UID: \"6e58acfa-d9c6-44d0-bff3-b12663e9095e\") " pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.569739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.572398 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wvknm" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.661654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv9vg\" (UniqueName: \"kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg\") pod \"glance-bf89-account-create-h42p7\" (UID: \"6e58acfa-d9c6-44d0-bff3-b12663e9095e\") " pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.693512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv9vg\" (UniqueName: \"kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg\") pod \"glance-bf89-account-create-h42p7\" (UID: \"6e58acfa-d9c6-44d0-bff3-b12663e9095e\") " pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.799071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vr6v8-config-95t9c"] Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.800342 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.810979 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.825601 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.832260 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vr6v8-config-95t9c"] Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.864946 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.865079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.865117 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.865146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.865172 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.865188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh72h\" (UniqueName: \"kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.966242 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.966301 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 
14:11:12.966339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.966364 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh72h\" (UniqueName: \"kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.966402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.966487 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.967158 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.967119 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.967392 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.968388 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.970202 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:12 crc kubenswrapper[4717]: I1007 14:11:12.986081 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh72h\" (UniqueName: \"kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h\") pod \"ovn-controller-vr6v8-config-95t9c\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.126117 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.294042 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bf89-account-create-h42p7"] Oct 07 14:11:13 crc kubenswrapper[4717]: W1007 14:11:13.301583 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e58acfa_d9c6_44d0_bff3_b12663e9095e.slice/crio-86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813 WatchSource:0}: Error finding container 86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813: Status 404 returned error can't find the container with id 86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813 Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.453962 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vr6v8-config-95t9c"] Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.833244 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"9fa86377b7b86db547cbad31414e8848667196cde5a9f9a4b8aa0630d09a2e19"} Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.834834 4717 generic.go:334] "Generic (PLEG): container finished" podID="6e58acfa-d9c6-44d0-bff3-b12663e9095e" containerID="375756edbedb6d7becdb928395e15978f91c835d1390007026c0238bde491bc2" exitCode=0 Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.835200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf89-account-create-h42p7" event={"ID":"6e58acfa-d9c6-44d0-bff3-b12663e9095e","Type":"ContainerDied","Data":"375756edbedb6d7becdb928395e15978f91c835d1390007026c0238bde491bc2"} Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.835226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf89-account-create-h42p7" event={"ID":"6e58acfa-d9c6-44d0-bff3-b12663e9095e","Type":"ContainerStarted","Data":"86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813"} Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.839021 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8-config-95t9c" event={"ID":"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e","Type":"ContainerStarted","Data":"0ff22ed15aa209e6635b16c1e44097438429e5b94e6f6069a757c16e326293d4"} Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.839057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8-config-95t9c" event={"ID":"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e","Type":"ContainerStarted","Data":"2071cce5b899cd6631b0dbe1cf423b70d647827fac3b92170f5203379f391d8b"} Oct 07 14:11:13 crc kubenswrapper[4717]: I1007 14:11:13.868881 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vr6v8-config-95t9c" podStartSLOduration=1.868857397 podStartE2EDuration="1.868857397s" podCreationTimestamp="2025-10-07 14:11:12 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:13.86175653 +0000 UTC m=+1055.689682332" watchObservedRunningTime="2025-10-07 14:11:13.868857397 +0000 UTC m=+1055.696783189" Oct 07 14:11:14 crc kubenswrapper[4717]: I1007 14:11:14.849398 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"b5b9215ad59ffb99ac8b7ab94d32072d4481325561ef61bdf46a546b4c94073b"} Oct 07 14:11:14 crc kubenswrapper[4717]: I1007 14:11:14.850949 4717 generic.go:334] "Generic (PLEG): container finished" podID="a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" containerID="0ff22ed15aa209e6635b16c1e44097438429e5b94e6f6069a757c16e326293d4" exitCode=0 Oct 07 14:11:14 crc kubenswrapper[4717]: I1007 14:11:14.851044 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8-config-95t9c" event={"ID":"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e","Type":"ContainerDied","Data":"0ff22ed15aa209e6635b16c1e44097438429e5b94e6f6069a757c16e326293d4"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.310022 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.440660 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv9vg\" (UniqueName: \"kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg\") pod \"6e58acfa-d9c6-44d0-bff3-b12663e9095e\" (UID: \"6e58acfa-d9c6-44d0-bff3-b12663e9095e\") " Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.446423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg" (OuterVolumeSpecName: "kube-api-access-xv9vg") pod "6e58acfa-d9c6-44d0-bff3-b12663e9095e" (UID: "6e58acfa-d9c6-44d0-bff3-b12663e9095e"). InnerVolumeSpecName "kube-api-access-xv9vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.542811 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv9vg\" (UniqueName: \"kubernetes.io/projected/6e58acfa-d9c6-44d0-bff3-b12663e9095e-kube-api-access-xv9vg\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865750 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"d462aae1683ca725c2cc9933fc5b0569cf87c3e66a9478e30a0668d67ec160b7"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865802 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"2e73b49775d2ebc3b5e5738b5028e453ddbdf4ead98c7e94fd8bd43343d9bbc3"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865816 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"9359239a9f2b34ebadddb4d366c0baa3203e3de64b7d29318de05e1f97b5fa57"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865828 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"69915b53992ac9a3b5d3c71f6bdf117a85183ea7b18a53e1246ead573d224327"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"5d58667e48c703e98773fc2288ff6790c99d6ab160b46a40ebcc62583b5e4b8d"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.865850 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"4d8f5328-2247-4e10-8d15-9902887bd75f","Type":"ContainerStarted","Data":"ce8ebf4535033096170c0d6000c1ca03c8b8101047595e049169f2fb3f3fdc48"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.867729 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bf89-account-create-h42p7" event={"ID":"6e58acfa-d9c6-44d0-bff3-b12663e9095e","Type":"ContainerDied","Data":"86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813"} Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.867778 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86ef3b9fc2d8580b13d4f21f1c27ce3388c2c31c6504784bcbed8df6a83b1813" Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.867843 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bf89-account-create-h42p7" Oct 07 14:11:15 crc kubenswrapper[4717]: I1007 14:11:15.911820 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.717289433 podStartE2EDuration="26.911802591s" podCreationTimestamp="2025-10-07 14:10:49 +0000 UTC" firstStartedPulling="2025-10-07 14:11:07.357934956 +0000 UTC m=+1049.185860748" lastFinishedPulling="2025-10-07 14:11:14.552448114 +0000 UTC m=+1056.380373906" observedRunningTime="2025-10-07 14:11:15.908592982 +0000 UTC m=+1057.736518804" watchObservedRunningTime="2025-10-07 14:11:15.911802591 +0000 UTC m=+1057.739728373" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.235202 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:11:16 crc kubenswrapper[4717]: E1007 14:11:16.235790 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e58acfa-d9c6-44d0-bff3-b12663e9095e" containerName="mariadb-account-create" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.235806 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e58acfa-d9c6-44d0-bff3-b12663e9095e" containerName="mariadb-account-create" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.236020 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e58acfa-d9c6-44d0-bff3-b12663e9095e" containerName="mariadb-account-create" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.236856 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.242332 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.254638 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.281319 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364145 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364210 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364237 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88sw\" (UniqueName: \"kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364492 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.364587 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.465802 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.465869 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.465919 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 
14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.465988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466022 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466074 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run" (OuterVolumeSpecName: "var-run") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466137 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh72h\" (UniqueName: \"kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466168 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466272 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts\") pod \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\" (UID: \"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e\") " Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466680 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p88sw\" (UniqueName: \"kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466708 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466699 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts" (OuterVolumeSpecName: "scripts") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466812 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.466863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.467482 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.467544 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.467601 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.467998 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468099 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468262 4717 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468378 4717 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468399 4717 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468414 4717 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.468811 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.470270 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h" (OuterVolumeSpecName: "kube-api-access-xh72h") pod "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" (UID: "a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e"). InnerVolumeSpecName "kube-api-access-xh72h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.485279 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88sw\" (UniqueName: \"kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw\") pod \"dnsmasq-dns-77585f5f8c-jm8l7\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.569815 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh72h\" (UniqueName: \"kubernetes.io/projected/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e-kube-api-access-xh72h\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.595511 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.877710 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-vr6v8-config-95t9c" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.878877 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vr6v8-config-95t9c" event={"ID":"a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e","Type":"ContainerDied","Data":"2071cce5b899cd6631b0dbe1cf423b70d647827fac3b92170f5203379f391d8b"} Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.878917 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2071cce5b899cd6631b0dbe1cf423b70d647827fac3b92170f5203379f391d8b" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.966277 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vr6v8-config-95t9c"] Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.976677 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vr6v8-config-95t9c"] Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.982753 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-0fb7-account-create-h62sp"] Oct 07 14:11:16 crc kubenswrapper[4717]: E1007 14:11:16.983119 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" containerName="ovn-config" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.983131 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" containerName="ovn-config" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.983286 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" containerName="ovn-config" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.983805 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.990284 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 07 14:11:16 crc kubenswrapper[4717]: I1007 14:11:16.997630 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fb7-account-create-h62sp"] Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.058990 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.178191 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwdxt\" (UniqueName: \"kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt\") pod \"keystone-0fb7-account-create-h62sp\" (UID: \"a976f8f9-576f-4e2d-bf4c-f2bed066725b\") " pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.279728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwdxt\" (UniqueName: \"kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt\") pod \"keystone-0fb7-account-create-h62sp\" (UID: \"a976f8f9-576f-4e2d-bf4c-f2bed066725b\") " pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.295349 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwdxt\" (UniqueName: \"kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt\") pod \"keystone-0fb7-account-create-h62sp\" (UID: \"a976f8f9-576f-4e2d-bf4c-f2bed066725b\") " pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.299556 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.585746 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vr6v8" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.608776 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jkk7w"] Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.610082 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.612520 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qgtbp" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.623351 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkk7w"] Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.630171 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.730083 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-0fb7-account-create-h62sp"] Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.789416 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.789500 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.789530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5stsk\" (UniqueName: \"kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.789610 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.886485 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fb7-account-create-h62sp" event={"ID":"a976f8f9-576f-4e2d-bf4c-f2bed066725b","Type":"ContainerStarted","Data":"dd3b38b6abdfaa89995f397fa2cee6fcde64e92d4ab3c6094284b0c674c6e609"} Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.888367 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerID="2cc165a67bfd20132b560f5504b6a1b738d4219d9839e35f4a10266814d616fd" exitCode=0 Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.888410 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" event={"ID":"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d","Type":"ContainerDied","Data":"2cc165a67bfd20132b560f5504b6a1b738d4219d9839e35f4a10266814d616fd"} Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.888435 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" event={"ID":"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d","Type":"ContainerStarted","Data":"d959487432f9bb27704b743d38a00a6112ae5080a5ae98e9dd3c3e7b52467cc9"} Oct 07 14:11:17 crc kubenswrapper[4717]: 
I1007 14:11:17.890408 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.890511 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.890562 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.890588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5stsk\" (UniqueName: \"kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.898783 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.906761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.906843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.913749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5stsk\" (UniqueName: \"kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk\") pod \"glance-db-sync-jkk7w\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:17 crc kubenswrapper[4717]: I1007 14:11:17.927448 4717 util.go:30] "No sandbox for pod can be found. 
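[annotation] The reconciler entries above walk each glance-db-sync-jkk7w volume through the kubelet's mount state machine: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded, once for config-data, db-sync-config-data, combined-ca-bundle and the projected kube-api-access-5stsk token. For reference, the same four volumes can be read back from the API server with client-go; this is a minimal illustrative sketch (the kubeconfig path is a placeholder), not part of the captured log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder path; any admin kubeconfig for this cluster would do.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// The pod whose volume setup is traced in the reconciler lines above.
	pod, err := clientset.CoreV1().Pods("openstack").Get(
		context.TODO(), "glance-db-sync-jkk7w", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}

	// One entry per VerifyControllerAttachedVolume / MountVolume.SetUp pair
	// in the log: three secret volumes plus the projected token volume.
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.Secret != nil:
			fmt.Printf("%-25s secret/%s\n", v.Name, v.Secret.SecretName)
		case v.Projected != nil:
			fmt.Printf("%-25s projected service-account token\n", v.Name)
		default:
			fmt.Printf("%-25s (other volume type)\n", v.Name)
		}
	}
}
```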
Need to start a new one" pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.540468 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jkk7w"] Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.883217 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e" path="/var/lib/kubelet/pods/a84c7edd-99b1-4c9a-bbd9-f2456f4ef66e/volumes" Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.899297 4717 generic.go:334] "Generic (PLEG): container finished" podID="a976f8f9-576f-4e2d-bf4c-f2bed066725b" containerID="d399456bc67fe32372b87ff9964ee175f2a2876534418938d4d0d674e70cb5a1" exitCode=0 Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.899378 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fb7-account-create-h62sp" event={"ID":"a976f8f9-576f-4e2d-bf4c-f2bed066725b","Type":"ContainerDied","Data":"d399456bc67fe32372b87ff9964ee175f2a2876534418938d4d0d674e70cb5a1"} Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.900826 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" event={"ID":"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d","Type":"ContainerStarted","Data":"d7ac35ae2c449f8d7113e2effe4db52ef5f451dd3b526099b9b6856e055ad678"} Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.901642 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.903477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkk7w" event={"ID":"c68042ef-a515-4c4a-b1d5-82fcc3549ea6","Type":"ContainerStarted","Data":"65b0ebb33d92dcc101d9675b038a7f3f39de4ba865b51db2b7bb84a65833ae56"} Oct 07 14:11:18 crc kubenswrapper[4717]: I1007 14:11:18.938441 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" podStartSLOduration=2.938419697 podStartE2EDuration="2.938419697s" podCreationTimestamp="2025-10-07 14:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:18.937040848 +0000 UTC m=+1060.764966660" watchObservedRunningTime="2025-10-07 14:11:18.938419697 +0000 UTC m=+1060.766345509" Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.246703 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.343809 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwdxt\" (UniqueName: \"kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt\") pod \"a976f8f9-576f-4e2d-bf4c-f2bed066725b\" (UID: \"a976f8f9-576f-4e2d-bf4c-f2bed066725b\") " Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.356667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt" (OuterVolumeSpecName: "kube-api-access-nwdxt") pod "a976f8f9-576f-4e2d-bf4c-f2bed066725b" (UID: "a976f8f9-576f-4e2d-bf4c-f2bed066725b"). InnerVolumeSpecName "kube-api-access-nwdxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.445871 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwdxt\" (UniqueName: \"kubernetes.io/projected/a976f8f9-576f-4e2d-bf4c-f2bed066725b-kube-api-access-nwdxt\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.926550 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-0fb7-account-create-h62sp" event={"ID":"a976f8f9-576f-4e2d-bf4c-f2bed066725b","Type":"ContainerDied","Data":"dd3b38b6abdfaa89995f397fa2cee6fcde64e92d4ab3c6094284b0c674c6e609"} Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.926591 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-0fb7-account-create-h62sp" Oct 07 14:11:20 crc kubenswrapper[4717]: I1007 14:11:20.926647 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd3b38b6abdfaa89995f397fa2cee6fcde64e92d4ab3c6094284b0c674c6e609" Oct 07 14:11:23 crc kubenswrapper[4717]: I1007 14:11:23.781215 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.144203 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.227603 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-l64ln"] Oct 07 14:11:24 crc kubenswrapper[4717]: E1007 14:11:24.228030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a976f8f9-576f-4e2d-bf4c-f2bed066725b" containerName="mariadb-account-create" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.228054 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a976f8f9-576f-4e2d-bf4c-f2bed066725b" containerName="mariadb-account-create" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.228290 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a976f8f9-576f-4e2d-bf4c-f2bed066725b" containerName="mariadb-account-create" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.242750 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.244665 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l64ln"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.330728 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-5d62c"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.332887 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5d62c" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.360183 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5d62c"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.416503 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-8kmdr"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.417897 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.419700 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m4h7z" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.420424 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.420703 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.422430 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.433666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s\") pod \"manila-db-create-5d62c\" (UID: \"e60f513f-7654-4b4a-b4ef-ea4637a7f364\") " pod="openstack/manila-db-create-5d62c" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.433800 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft56m\" (UniqueName: \"kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m\") pod \"cinder-db-create-l64ln\" (UID: \"e05b05a1-ebbd-4437-aa14-2b6736950306\") " pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.451186 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8kmdr"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.474387 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rmvdk"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.475563 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.485522 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rmvdk"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.535597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.535718 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s\") pod \"manila-db-create-5d62c\" (UID: \"e60f513f-7654-4b4a-b4ef-ea4637a7f364\") " pod="openstack/manila-db-create-5d62c" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.535749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv96b\" (UniqueName: \"kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.535791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.535822 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft56m\" (UniqueName: \"kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m\") pod \"cinder-db-create-l64ln\" (UID: \"e05b05a1-ebbd-4437-aa14-2b6736950306\") " pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.542239 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dq9k9"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.543551 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.559475 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dq9k9"] Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.564907 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s\") pod \"manila-db-create-5d62c\" (UID: \"e60f513f-7654-4b4a-b4ef-ea4637a7f364\") " pod="openstack/manila-db-create-5d62c" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.565532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft56m\" (UniqueName: \"kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m\") pod \"cinder-db-create-l64ln\" (UID: \"e05b05a1-ebbd-4437-aa14-2b6736950306\") " pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.569494 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.637712 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t52x\" (UniqueName: \"kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x\") pod \"barbican-db-create-rmvdk\" (UID: \"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633\") " pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.637767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.637970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv96b\" (UniqueName: \"kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.638082 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.638188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfl2b\" (UniqueName: \"kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b\") pod \"neutron-db-create-dq9k9\" (UID: \"93c14c91-5e26-48d3-86c8-79dcb89d3c16\") " pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.641423 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 
14:11:24.642110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.656265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv96b\" (UniqueName: \"kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b\") pod \"keystone-db-sync-8kmdr\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.657816 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5d62c" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.739827 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t52x\" (UniqueName: \"kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x\") pod \"barbican-db-create-rmvdk\" (UID: \"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633\") " pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.740256 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfl2b\" (UniqueName: \"kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b\") pod \"neutron-db-create-dq9k9\" (UID: \"93c14c91-5e26-48d3-86c8-79dcb89d3c16\") " pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.757988 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfl2b\" (UniqueName: \"kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b\") pod \"neutron-db-create-dq9k9\" (UID: \"93c14c91-5e26-48d3-86c8-79dcb89d3c16\") " pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.761634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t52x\" (UniqueName: \"kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x\") pod \"barbican-db-create-rmvdk\" (UID: \"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633\") " pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.764604 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.796433 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:24 crc kubenswrapper[4717]: I1007 14:11:24.865337 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:26 crc kubenswrapper[4717]: I1007 14:11:26.597469 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:11:26 crc kubenswrapper[4717]: I1007 14:11:26.663720 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:11:26 crc kubenswrapper[4717]: I1007 14:11:26.664100 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-fntpl" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="dnsmasq-dns" containerID="cri-o://9a107a91c7fda81a56b6cf1c9d245cdd07be8bb2bd52886e53b767ec56762d48" gracePeriod=10 Oct 07 14:11:26 crc kubenswrapper[4717]: I1007 14:11:26.977930 4717 generic.go:334] "Generic (PLEG): container finished" podID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerID="9a107a91c7fda81a56b6cf1c9d245cdd07be8bb2bd52886e53b767ec56762d48" exitCode=0 Oct 07 14:11:26 crc kubenswrapper[4717]: I1007 14:11:26.977981 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-fntpl" event={"ID":"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee","Type":"ContainerDied","Data":"9a107a91c7fda81a56b6cf1c9d245cdd07be8bb2bd52886e53b767ec56762d48"} Oct 07 14:11:29 crc kubenswrapper[4717]: I1007 14:11:29.784330 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-fntpl" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.527809 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.638671 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc\") pod \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.638729 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb\") pod \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.638953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvng4\" (UniqueName: \"kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4\") pod \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.639029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config\") pod \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.639056 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb\") pod \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\" (UID: \"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee\") " Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.643422 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4" (OuterVolumeSpecName: "kube-api-access-gvng4") pod "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" (UID: "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee"). InnerVolumeSpecName "kube-api-access-gvng4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.676827 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config" (OuterVolumeSpecName: "config") pod "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" (UID: "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.682879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" (UID: "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.686133 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" (UID: "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.686782 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" (UID: "d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.740994 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvng4\" (UniqueName: \"kubernetes.io/projected/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-kube-api-access-gvng4\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.741179 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.741190 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.741200 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.741208 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.819915 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l64ln"] Oct 07 14:11:30 crc kubenswrapper[4717]: W1007 14:11:30.828660 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode05b05a1_ebbd_4437_aa14_2b6736950306.slice/crio-f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff WatchSource:0}: Error finding container f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff: Status 404 returned error can't find the container with id f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.957851 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5d62c"] Oct 07 14:11:30 crc kubenswrapper[4717]: W1007 14:11:30.962601 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93c14c91_5e26_48d3_86c8_79dcb89d3c16.slice/crio-ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5 WatchSource:0}: Error finding container ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5: Status 404 returned error can't find the container with id ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5 Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.966079 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dq9k9"] Oct 07 14:11:30 crc kubenswrapper[4717]: W1007 14:11:30.969081 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode60f513f_7654_4b4a_b4ef_ea4637a7f364.slice/crio-96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab WatchSource:0}: Error finding container 96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab: Status 404 returned error can't find the container with id 96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab Oct 07 14:11:30 crc kubenswrapper[4717]: I1007 14:11:30.971610 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-8kmdr"] Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.022693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8kmdr" event={"ID":"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5","Type":"ContainerStarted","Data":"88f0528b306c6b18f96390233167093c74b33f6cc65c3d63e22ff2c7aa68f77a"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.046418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dq9k9" event={"ID":"93c14c91-5e26-48d3-86c8-79dcb89d3c16","Type":"ContainerStarted","Data":"ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.047789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5d62c" event={"ID":"e60f513f-7654-4b4a-b4ef-ea4637a7f364","Type":"ContainerStarted","Data":"96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.049553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l64ln" event={"ID":"e05b05a1-ebbd-4437-aa14-2b6736950306","Type":"ContainerStarted","Data":"457b4967b8bd1f9dd527d2982aa50fb134455c69fc8a81b663b070171a9751f0"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.049584 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l64ln" event={"ID":"e05b05a1-ebbd-4437-aa14-2b6736950306","Type":"ContainerStarted","Data":"f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.052740 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-fntpl" event={"ID":"d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee","Type":"ContainerDied","Data":"fe86ab0a8134eee760214faef0e5a79af497c6f7cf880f8575867627b5fbd4a9"} Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.052798 4717 scope.go:117] "RemoveContainer" containerID="9a107a91c7fda81a56b6cf1c9d245cdd07be8bb2bd52886e53b767ec56762d48" Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.052832 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-fntpl" Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.064972 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-l64ln" podStartSLOduration=7.064955615 podStartE2EDuration="7.064955615s" podCreationTimestamp="2025-10-07 14:11:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:31.064877603 +0000 UTC m=+1072.892803405" watchObservedRunningTime="2025-10-07 14:11:31.064955615 +0000 UTC m=+1072.892881407" Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.085482 4717 scope.go:117] "RemoveContainer" containerID="c9619941bb0c85b5f2fee87d908473f97cf889cc89d4681be856c98b5c12286f" Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.088939 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.096400 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-fntpl"] Oct 07 14:11:31 crc kubenswrapper[4717]: I1007 14:11:31.101909 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rmvdk"] Oct 07 14:11:31 crc kubenswrapper[4717]: W1007 14:11:31.150765 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod193ab1fd_3c4e_463b_9f5f_a85cbe1fd633.slice/crio-94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12 WatchSource:0}: Error finding container 94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12: Status 404 returned error can't find the container with id 94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12 Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.060141 4717 generic.go:334] "Generic (PLEG): container finished" podID="93c14c91-5e26-48d3-86c8-79dcb89d3c16" containerID="20ebb6863b62e62972bb58f55e349ce82e4f592f72de0a2a477eefe6988852ee" exitCode=0 Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.060222 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dq9k9" event={"ID":"93c14c91-5e26-48d3-86c8-79dcb89d3c16","Type":"ContainerDied","Data":"20ebb6863b62e62972bb58f55e349ce82e4f592f72de0a2a477eefe6988852ee"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.062056 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkk7w" event={"ID":"c68042ef-a515-4c4a-b1d5-82fcc3549ea6","Type":"ContainerStarted","Data":"1a2ca749888268961419e07265521cf4631ed355bcdb7e4902963cf7249fee16"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.063468 4717 generic.go:334] "Generic (PLEG): container finished" podID="e60f513f-7654-4b4a-b4ef-ea4637a7f364" containerID="4454fe08906d672ec4fca90e12cdb29a788254b640a5e9c8acba8b50d65abe2e" exitCode=0 Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.063552 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5d62c" event={"ID":"e60f513f-7654-4b4a-b4ef-ea4637a7f364","Type":"ContainerDied","Data":"4454fe08906d672ec4fca90e12cdb29a788254b640a5e9c8acba8b50d65abe2e"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.064765 4717 generic.go:334] "Generic (PLEG): container finished" podID="e05b05a1-ebbd-4437-aa14-2b6736950306" containerID="457b4967b8bd1f9dd527d2982aa50fb134455c69fc8a81b663b070171a9751f0" exitCode=0 Oct 07 14:11:32 crc 
kubenswrapper[4717]: I1007 14:11:32.064811 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l64ln" event={"ID":"e05b05a1-ebbd-4437-aa14-2b6736950306","Type":"ContainerDied","Data":"457b4967b8bd1f9dd527d2982aa50fb134455c69fc8a81b663b070171a9751f0"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.067273 4717 generic.go:334] "Generic (PLEG): container finished" podID="193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" containerID="5148104e540ffb1d2141d384ee907c2ea28a83f8f8fe77595f9121ae58c67c05" exitCode=0 Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.067304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rmvdk" event={"ID":"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633","Type":"ContainerDied","Data":"5148104e540ffb1d2141d384ee907c2ea28a83f8f8fe77595f9121ae58c67c05"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.067334 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rmvdk" event={"ID":"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633","Type":"ContainerStarted","Data":"94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12"} Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.112799 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jkk7w" podStartSLOduration=3.228927898 podStartE2EDuration="15.112784623s" podCreationTimestamp="2025-10-07 14:11:17 +0000 UTC" firstStartedPulling="2025-10-07 14:11:18.546261673 +0000 UTC m=+1060.374187455" lastFinishedPulling="2025-10-07 14:11:30.430118388 +0000 UTC m=+1072.258044180" observedRunningTime="2025-10-07 14:11:32.112725121 +0000 UTC m=+1073.940650913" watchObservedRunningTime="2025-10-07 14:11:32.112784623 +0000 UTC m=+1073.940710415" Oct 07 14:11:32 crc kubenswrapper[4717]: I1007 14:11:32.890668 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" path="/var/lib/kubelet/pods/d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee/volumes" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:34.999894 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.053748 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5d62c" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.078801 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.093429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l64ln" event={"ID":"e05b05a1-ebbd-4437-aa14-2b6736950306","Type":"ContainerDied","Data":"f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff"} Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.093580 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f47ccf2fe15163eb4517cb4292238b4b3d40b9b4b023ecadb64be7dc113fdfff" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.095117 4717 util.go:48] "No ready sandbox for pod can be found. 
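[annotation] The pod_startup_latency_tracker entry above for glance-db-sync-jkk7w reports podStartE2EDuration=15.112784623s but podStartSLOduration=3.228927898: the SLO figure appears to exclude the roughly 11.88s spent pulling the image between firstStartedPulling and lastFinishedPulling. A quick check of that arithmetic, with timestamps copied from the log (the exact formula is an assumption, and the result matches only to within a few tens of nanoseconds of clock skew):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matches the timestamps printed in the kubelet log.
		t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}

	created := parse("2025-10-07 14:11:17 +0000 UTC")        // podCreationTimestamp
	running := parse("2025-10-07 14:11:32.112784623 +0000 UTC") // watchObservedRunningTime
	pullStart := parse("2025-10-07 14:11:18.546261673 +0000 UTC") // firstStartedPulling
	pullEnd := parse("2025-10-07 14:11:30.430118388 +0000 UTC")   // lastFinishedPulling

	e2e := running.Sub(created)    // ~15.112784623s, matches podStartE2EDuration
	pull := pullEnd.Sub(pullStart) // ~11.883856715s spent pulling the image
	fmt.Println("podStartE2EDuration ≈", e2e)
	fmt.Println("podStartSLOduration ≈", e2e-pull) // ~3.2289s, matches the log
}
```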
Need to start a new one" pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.098625 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rmvdk" event={"ID":"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633","Type":"ContainerDied","Data":"94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12"} Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.098931 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94742f04ca6bf3e3c3c64dd964a6281557ad8b094d2bfb5b9cf376df79ae3f12" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.098963 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rmvdk" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.100839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dq9k9" event={"ID":"93c14c91-5e26-48d3-86c8-79dcb89d3c16","Type":"ContainerDied","Data":"ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5"} Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.100972 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9d15d7b782c034d8c43601c2bd6c072077021497771560b2fa702a828540f5" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.101210 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dq9k9" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.103331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5d62c" event={"ID":"e60f513f-7654-4b4a-b4ef-ea4637a7f364","Type":"ContainerDied","Data":"96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab"} Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.103400 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96bcfde36f52a76a6d19095dabf1cae9aa6f9d0d7a3345b1ebb189455e501cab" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.103437 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5d62c" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.112058 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfl2b\" (UniqueName: \"kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b\") pod \"93c14c91-5e26-48d3-86c8-79dcb89d3c16\" (UID: \"93c14c91-5e26-48d3-86c8-79dcb89d3c16\") " Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.130348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b" (OuterVolumeSpecName: "kube-api-access-qfl2b") pod "93c14c91-5e26-48d3-86c8-79dcb89d3c16" (UID: "93c14c91-5e26-48d3-86c8-79dcb89d3c16"). InnerVolumeSpecName "kube-api-access-qfl2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.213083 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s\") pod \"e60f513f-7654-4b4a-b4ef-ea4637a7f364\" (UID: \"e60f513f-7654-4b4a-b4ef-ea4637a7f364\") " Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.213494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t52x\" (UniqueName: \"kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x\") pod \"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633\" (UID: \"193ab1fd-3c4e-463b-9f5f-a85cbe1fd633\") " Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.213586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ft56m\" (UniqueName: \"kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m\") pod \"e05b05a1-ebbd-4437-aa14-2b6736950306\" (UID: \"e05b05a1-ebbd-4437-aa14-2b6736950306\") " Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.213982 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfl2b\" (UniqueName: \"kubernetes.io/projected/93c14c91-5e26-48d3-86c8-79dcb89d3c16-kube-api-access-qfl2b\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.217624 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s" (OuterVolumeSpecName: "kube-api-access-4ll8s") pod "e60f513f-7654-4b4a-b4ef-ea4637a7f364" (UID: "e60f513f-7654-4b4a-b4ef-ea4637a7f364"). InnerVolumeSpecName "kube-api-access-4ll8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.217703 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x" (OuterVolumeSpecName: "kube-api-access-9t52x") pod "193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" (UID: "193ab1fd-3c4e-463b-9f5f-a85cbe1fd633"). InnerVolumeSpecName "kube-api-access-9t52x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.217793 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m" (OuterVolumeSpecName: "kube-api-access-ft56m") pod "e05b05a1-ebbd-4437-aa14-2b6736950306" (UID: "e05b05a1-ebbd-4437-aa14-2b6736950306"). InnerVolumeSpecName "kube-api-access-ft56m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.316199 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ll8s\" (UniqueName: \"kubernetes.io/projected/e60f513f-7654-4b4a-b4ef-ea4637a7f364-kube-api-access-4ll8s\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.316599 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t52x\" (UniqueName: \"kubernetes.io/projected/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633-kube-api-access-9t52x\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:35 crc kubenswrapper[4717]: I1007 14:11:35.316614 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ft56m\" (UniqueName: \"kubernetes.io/projected/e05b05a1-ebbd-4437-aa14-2b6736950306-kube-api-access-ft56m\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:36 crc kubenswrapper[4717]: I1007 14:11:36.113874 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8kmdr" event={"ID":"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5","Type":"ContainerStarted","Data":"807e6086481132bbd39a602a72f55a985855bb36106610a0040511af613d0c5d"} Oct 07 14:11:36 crc kubenswrapper[4717]: I1007 14:11:36.113890 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l64ln" Oct 07 14:11:36 crc kubenswrapper[4717]: I1007 14:11:36.143907 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-8kmdr" podStartSLOduration=8.208076632 podStartE2EDuration="12.143884855s" podCreationTimestamp="2025-10-07 14:11:24 +0000 UTC" firstStartedPulling="2025-10-07 14:11:30.987842888 +0000 UTC m=+1072.815768680" lastFinishedPulling="2025-10-07 14:11:34.923651111 +0000 UTC m=+1076.751576903" observedRunningTime="2025-10-07 14:11:36.135303807 +0000 UTC m=+1077.963229609" watchObservedRunningTime="2025-10-07 14:11:36.143884855 +0000 UTC m=+1077.971810657" Oct 07 14:11:38 crc kubenswrapper[4717]: I1007 14:11:38.129934 4717 generic.go:334] "Generic (PLEG): container finished" podID="842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" containerID="807e6086481132bbd39a602a72f55a985855bb36106610a0040511af613d0c5d" exitCode=0 Oct 07 14:11:38 crc kubenswrapper[4717]: I1007 14:11:38.130028 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8kmdr" event={"ID":"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5","Type":"ContainerDied","Data":"807e6086481132bbd39a602a72f55a985855bb36106610a0040511af613d0c5d"} Oct 07 14:11:38 crc kubenswrapper[4717]: I1007 14:11:38.131676 4717 generic.go:334] "Generic (PLEG): container finished" podID="c68042ef-a515-4c4a-b1d5-82fcc3549ea6" containerID="1a2ca749888268961419e07265521cf4631ed355bcdb7e4902963cf7249fee16" exitCode=0 Oct 07 14:11:38 crc kubenswrapper[4717]: I1007 14:11:38.131711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkk7w" event={"ID":"c68042ef-a515-4c4a-b1d5-82fcc3549ea6","Type":"ContainerDied","Data":"1a2ca749888268961419e07265521cf4631ed355bcdb7e4902963cf7249fee16"} Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.446681 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.538420 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.582853 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv96b\" (UniqueName: \"kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b\") pod \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.582927 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data\") pod \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.582988 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle\") pod \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\" (UID: \"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.589386 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b" (OuterVolumeSpecName: "kube-api-access-hv96b") pod "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" (UID: "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5"). InnerVolumeSpecName "kube-api-access-hv96b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.613244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" (UID: "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.624516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data" (OuterVolumeSpecName: "config-data") pod "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" (UID: "842ad804-63f4-4fb3-9f2e-f7e70c91f3a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.684477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data\") pod \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.684604 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data\") pod \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.684712 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5stsk\" (UniqueName: \"kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk\") pod \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.684744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle\") pod \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\" (UID: \"c68042ef-a515-4c4a-b1d5-82fcc3549ea6\") " Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.685081 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.685095 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv96b\" (UniqueName: \"kubernetes.io/projected/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-kube-api-access-hv96b\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.685106 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.687383 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c68042ef-a515-4c4a-b1d5-82fcc3549ea6" (UID: "c68042ef-a515-4c4a-b1d5-82fcc3549ea6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.687823 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk" (OuterVolumeSpecName: "kube-api-access-5stsk") pod "c68042ef-a515-4c4a-b1d5-82fcc3549ea6" (UID: "c68042ef-a515-4c4a-b1d5-82fcc3549ea6"). InnerVolumeSpecName "kube-api-access-5stsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.704946 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c68042ef-a515-4c4a-b1d5-82fcc3549ea6" (UID: "c68042ef-a515-4c4a-b1d5-82fcc3549ea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.731184 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data" (OuterVolumeSpecName: "config-data") pod "c68042ef-a515-4c4a-b1d5-82fcc3549ea6" (UID: "c68042ef-a515-4c4a-b1d5-82fcc3549ea6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.787712 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.787743 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5stsk\" (UniqueName: \"kubernetes.io/projected/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-kube-api-access-5stsk\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.787786 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:39 crc kubenswrapper[4717]: I1007 14:11:39.787797 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c68042ef-a515-4c4a-b1d5-82fcc3549ea6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.148858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jkk7w" event={"ID":"c68042ef-a515-4c4a-b1d5-82fcc3549ea6","Type":"ContainerDied","Data":"65b0ebb33d92dcc101d9675b038a7f3f39de4ba865b51db2b7bb84a65833ae56"} Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.148891 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jkk7w" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.148897 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b0ebb33d92dcc101d9675b038a7f3f39de4ba865b51db2b7bb84a65833ae56" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.151581 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-8kmdr" event={"ID":"842ad804-63f4-4fb3-9f2e-f7e70c91f3a5","Type":"ContainerDied","Data":"88f0528b306c6b18f96390233167093c74b33f6cc65c3d63e22ff2c7aa68f77a"} Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.151604 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88f0528b306c6b18f96390233167093c74b33f6cc65c3d63e22ff2c7aa68f77a" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.151629 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-8kmdr" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.570642 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-tt2s2"] Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571232 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="dnsmasq-dns" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571250 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="dnsmasq-dns" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571267 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="init" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571275 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="init" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571288 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c14c91-5e26-48d3-86c8-79dcb89d3c16" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571296 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c14c91-5e26-48d3-86c8-79dcb89d3c16" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571306 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60f513f-7654-4b4a-b4ef-ea4637a7f364" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571314 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60f513f-7654-4b4a-b4ef-ea4637a7f364" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571338 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" containerName="keystone-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571345 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" containerName="keystone-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571357 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571364 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571377 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c68042ef-a515-4c4a-b1d5-82fcc3549ea6" containerName="glance-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571383 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c68042ef-a515-4c4a-b1d5-82fcc3549ea6" containerName="glance-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.571397 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e05b05a1-ebbd-4437-aa14-2b6736950306" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571404 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e05b05a1-ebbd-4437-aa14-2b6736950306" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571568 4717 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="e60f513f-7654-4b4a-b4ef-ea4637a7f364" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.571583 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" containerName="keystone-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.572729 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.572756 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c68042ef-a515-4c4a-b1d5-82fcc3549ea6" containerName="glance-db-sync" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.572792 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e05b05a1-ebbd-4437-aa14-2b6736950306" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.572812 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c14c91-5e26-48d3-86c8-79dcb89d3c16" containerName="mariadb-database-create" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.572828 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e24f9c-97b5-49a4-83d8-5f0c0f3b70ee" containerName="dnsmasq-dns" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.573950 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.601447 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-tt2s2"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.675762 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qtb44"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.676883 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.679725 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.680162 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.681353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.682638 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m4h7z" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.683432 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-tt2s2"] Oct 07 14:11:40 crc kubenswrapper[4717]: E1007 14:11:40.683940 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-lslvx ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" podUID="1be76194-afbc-4d69-a5a0-8ad4e42c7066" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.701847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.701887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lslvx\" (UniqueName: \"kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.701955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.701984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.702132 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.702164 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.710314 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qtb44"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.755707 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.758283 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.773938 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.804999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805088 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805136 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805152 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805212 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805237 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: 
\"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805252 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lslvx\" (UniqueName: \"kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805269 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96c6d\" (UniqueName: \"kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.805379 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.806237 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.806843 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.807411 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.807930 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " 
pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.808569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.835900 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lslvx\" (UniqueName: \"kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx\") pod \"dnsmasq-dns-7ff5475cc9-tt2s2\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.852789 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.878102 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.884202 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.887944 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.888282 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-gdlq4" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.888878 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.889081 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908437 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908506 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908529 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908549 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96c6d\" (UniqueName: \"kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " 
pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908627 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908684 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjs6\" (UniqueName: \"kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908728 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.908760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.917144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc 
kubenswrapper[4717]: I1007 14:11:40.922632 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.923452 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.928806 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.937516 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.938579 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96c6d\" (UniqueName: \"kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d\") pod \"keystone-bootstrap-qtb44\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.960675 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.962929 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.970224 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.970428 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:11:40 crc kubenswrapper[4717]: I1007 14:11:40.986862 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.003157 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.029864 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030369 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdw7\" (UniqueName: \"kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjs6\" (UniqueName: \"kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030485 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030518 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: 
\"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030583 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.030629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.031458 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.031966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.032517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.033140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.033182 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.060499 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.062552 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.069488 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjs6\" (UniqueName: \"kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6\") pod \"dnsmasq-dns-5c5cc7c5ff-l5rp9\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.070185 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.070186 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.070477 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-qgtbp" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.070674 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.079601 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.096150 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.104682 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.106064 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.121722 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.132605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdw7\" (UniqueName: \"kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.133627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.133748 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjss\" (UniqueName: \"kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.133882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc 
kubenswrapper[4717]: I1007 14:11:41.133999 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.134160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.134297 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.134426 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.134631 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.134995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.141647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.141904 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.141480 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.135646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs\") pod \"horizon-747d686857-svzsx\" 
(UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.135854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.150161 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-qmft2"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.151407 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.154474 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.154767 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.154880 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nx2x7" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.155396 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.160597 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.171140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdw7\" (UniqueName: \"kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7\") pod \"horizon-747d686857-svzsx\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.177836 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.186455 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qmft2"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.200968 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.202349 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.204807 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.207726 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.212109 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.214077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.225139 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.232832 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249321 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249367 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249407 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249483 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249551 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249693 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjss\" (UniqueName: \"kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249721 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249743 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249763 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249787 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfjjl\" (UniqueName: \"kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249812 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249878 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwppp\" (UniqueName: \"kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249904 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249921 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249944 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sp9q\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.249992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.250059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.250486 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.250896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.256664 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.256803 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.270743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.284345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.290026 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjss\" (UniqueName: \"kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss\") pod \"ceilometer-0\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.312727 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-747d686857-svzsx" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.339289 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351362 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351539 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lslvx\" (UniqueName: \"kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351878 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.351906 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0\") pod \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\" (UID: \"1be76194-afbc-4d69-a5a0-8ad4e42c7066\") " Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352049 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config" (OuterVolumeSpecName: "config") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d87vv\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352238 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352254 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt87\" (UniqueName: \"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352356 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc 
kubenswrapper[4717]: I1007 14:11:41.352382 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352400 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352459 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352518 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352648 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352654 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.352716 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353262 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353618 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353742 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353836 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 
07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353867 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfjjl\" (UniqueName: \"kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353920 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353938 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.353988 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354071 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354117 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwppp\" (UniqueName: \"kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354404 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354568 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 
07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354647 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.354702 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355120 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355127 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sp9q\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355466 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355495 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355573 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355585 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: 
I1007 14:11:41.355597 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355611 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.355622 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1be76194-afbc-4d69-a5a0-8ad4e42c7066-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.358536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.358850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.368250 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.370883 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.373748 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx" (OuterVolumeSpecName: "kube-api-access-lslvx") pod "1be76194-afbc-4d69-a5a0-8ad4e42c7066" (UID: "1be76194-afbc-4d69-a5a0-8ad4e42c7066"). InnerVolumeSpecName "kube-api-access-lslvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.374893 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.375222 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwppp\" (UniqueName: \"kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp\") pod \"horizon-7b867bb977-pb7h8\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.375584 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.375593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sp9q\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.381476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.388871 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfjjl\" (UniqueName: \"kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl\") pod \"placement-db-sync-qmft2\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.398569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.405600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.427733 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.448235 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458710 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458757 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458787 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458852 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458874 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d87vv\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt87\" (UniqueName: \"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458950 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 
14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.458975 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.459013 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.459035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.459053 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.459075 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.459120 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lslvx\" (UniqueName: \"kubernetes.io/projected/1be76194-afbc-4d69-a5a0-8ad4e42c7066-kube-api-access-lslvx\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.460541 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.460855 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.461410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.465601 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"cdb138bd-7439-4632-b25b-457048de3859\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.466238 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.466918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.468177 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.468513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.469151 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.469531 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qmft2" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.471359 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.479206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.479649 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.488463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d87vv\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.499961 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt87\" (UniqueName: \"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87\") pod \"dnsmasq-dns-8b5c85b87-svjc9\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.536726 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.549171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.669888 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qtb44"] Oct 07 14:11:41 crc kubenswrapper[4717]: W1007 14:11:41.701694 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b4ae6a5_dd6b_46d5_8cca_0f6a7655f655.slice/crio-045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f WatchSource:0}: Error finding container 045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f: Status 404 returned error can't find the container with id 045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.746611 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:41 crc kubenswrapper[4717]: W1007 14:11:41.766973 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb10b0d37_662c_4559_9bc2_adb424cae014.slice/crio-28291a476225ae5b44946b9843eae7728ba13a2a7c08d45a918613c6b4b5325e WatchSource:0}: Error finding container 28291a476225ae5b44946b9843eae7728ba13a2a7c08d45a918613c6b4b5325e: Status 404 returned error can't find the container with id 28291a476225ae5b44946b9843eae7728ba13a2a7c08d45a918613c6b4b5325e Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.821225 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:41 crc kubenswrapper[4717]: I1007 14:11:41.880568 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:11:42 crc kubenswrapper[4717]: W1007 14:11:42.040681 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8f021f_8b1b_4a30_80a4_b01a299c734f.slice/crio-8ab925f1099f4dd7df528bdf628e09192f89ce11149ca6ddeba9aa3f3dc9916c WatchSource:0}: Error finding container 8ab925f1099f4dd7df528bdf628e09192f89ce11149ca6ddeba9aa3f3dc9916c: Status 404 returned error can't find the container with id 8ab925f1099f4dd7df528bdf628e09192f89ce11149ca6ddeba9aa3f3dc9916c Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.042641 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.054790 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-qmft2"] Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.115736 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.126488 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:11:42 crc kubenswrapper[4717]: W1007 14:11:42.130401 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod362407be_b43c_4a74_8b08_d22518d2b6b4.slice/crio-0357e296d3c91b2f3a8d7ff569c15436f61eb34828820e8496a45f1672420c16 WatchSource:0}: Error finding container 0357e296d3c91b2f3a8d7ff569c15436f61eb34828820e8496a45f1672420c16: Status 404 returned error can't find the container with id 0357e296d3c91b2f3a8d7ff569c15436f61eb34828820e8496a45f1672420c16 Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.185664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerStarted","Data":"8ab925f1099f4dd7df528bdf628e09192f89ce11149ca6ddeba9aa3f3dc9916c"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.191942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qmft2" event={"ID":"eb1a5ed9-e123-447d-a56d-e0cce35eb56a","Type":"ContainerStarted","Data":"abd0d61d8620a77edcb9f1be000cf5e05efca5337bb190b804197adc26169166"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.194205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" event={"ID":"1396fa07-0a19-481d-ae2b-f943755ea2ad","Type":"ContainerStarted","Data":"5532342dc50cb1a4fd174e1ca67b50cbb9464b4138e1445899ca4670d3ac7489"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.197892 4717 generic.go:334] "Generic (PLEG): container finished" podID="b10b0d37-662c-4559-9bc2-adb424cae014" containerID="742be67ac66f61f8f6c2d8fc5e92f929efc939cd7f4b271f18c1f22539ab3cdf" exitCode=0 Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.197949 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" event={"ID":"b10b0d37-662c-4559-9bc2-adb424cae014","Type":"ContainerDied","Data":"742be67ac66f61f8f6c2d8fc5e92f929efc939cd7f4b271f18c1f22539ab3cdf"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.197990 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" event={"ID":"b10b0d37-662c-4559-9bc2-adb424cae014","Type":"ContainerStarted","Data":"28291a476225ae5b44946b9843eae7728ba13a2a7c08d45a918613c6b4b5325e"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.201409 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtb44" event={"ID":"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655","Type":"ContainerStarted","Data":"154cac6ff0aa1e601735fe658a0b5326673b4765554107198a72eb49227ad728"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.201444 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtb44" event={"ID":"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655","Type":"ContainerStarted","Data":"045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.203660 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b867bb977-pb7h8" event={"ID":"362407be-b43c-4a74-8b08-d22518d2b6b4","Type":"ContainerStarted","Data":"0357e296d3c91b2f3a8d7ff569c15436f61eb34828820e8496a45f1672420c16"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.204663 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-tt2s2" Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.204670 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-747d686857-svzsx" event={"ID":"6585c6e8-1873-4e2a-ba9c-9590fc162dcb","Type":"ContainerStarted","Data":"32524a10b6f0bed64626265f7a50864d07c4ab95f222628f38cb7cd7e03dd9dd"} Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.244259 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qtb44" podStartSLOduration=2.244227292 podStartE2EDuration="2.244227292s" podCreationTimestamp="2025-10-07 14:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:42.240466708 +0000 UTC m=+1084.068392500" watchObservedRunningTime="2025-10-07 14:11:42.244227292 +0000 UTC m=+1084.072153084" Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.288088 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-tt2s2"] Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.296994 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-tt2s2"] Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.351167 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:42 crc kubenswrapper[4717]: W1007 14:11:42.359290 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a8a05c1_27f7_446e_b49e_1762d51644a5.slice/crio-e0b8da7378bfb191e91dc012c1a3d72e02d25a5f9c46da932b23297f30abd98a WatchSource:0}: Error finding container e0b8da7378bfb191e91dc012c1a3d72e02d25a5f9c46da932b23297f30abd98a: Status 404 returned error can't find the container with id e0b8da7378bfb191e91dc012c1a3d72e02d25a5f9c46da932b23297f30abd98a Oct 07 14:11:42 crc kubenswrapper[4717]: I1007 14:11:42.880532 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1be76194-afbc-4d69-a5a0-8ad4e42c7066" path="/var/lib/kubelet/pods/1be76194-afbc-4d69-a5a0-8ad4e42c7066/volumes" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.149673 4717 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.277276 4717 generic.go:334] "Generic (PLEG): container finished" podID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerID="0c4aa575a987e95950b9dc33620acd3e9da9eef5ffc4eeb6b40aacfefdc85574" exitCode=0 Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.277367 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" event={"ID":"1396fa07-0a19-481d-ae2b-f943755ea2ad","Type":"ContainerDied","Data":"0c4aa575a987e95950b9dc33620acd3e9da9eef5ffc4eeb6b40aacfefdc85574"} Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.305201 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.305397 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.305582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.305717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.306054 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtjs6\" (UniqueName: \"kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.306163 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc\") pod \"b10b0d37-662c-4559-9bc2-adb424cae014\" (UID: \"b10b0d37-662c-4559-9bc2-adb424cae014\") " Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.336729 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6" (OuterVolumeSpecName: "kube-api-access-xtjs6") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "kube-api-access-xtjs6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.378829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.381631 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.381653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9" event={"ID":"b10b0d37-662c-4559-9bc2-adb424cae014","Type":"ContainerDied","Data":"28291a476225ae5b44946b9843eae7728ba13a2a7c08d45a918613c6b4b5325e"} Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.381699 4717 scope.go:117] "RemoveContainer" containerID="742be67ac66f61f8f6c2d8fc5e92f929efc939cd7f4b271f18c1f22539ab3cdf" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.387814 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.397843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerStarted","Data":"10bdfcd3afba85494b2aac3c5e93d1e60b06abbc51bb3018f176f583a653c185"} Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.397884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerStarted","Data":"e0b8da7378bfb191e91dc012c1a3d72e02d25a5f9c46da932b23297f30abd98a"} Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.408660 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtjs6\" (UniqueName: \"kubernetes.io/projected/b10b0d37-662c-4559-9bc2-adb424cae014-kube-api-access-xtjs6\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.408691 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.408701 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.412188 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.465856 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config" (OuterVolumeSpecName: "config") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.480814 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.489578 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b10b0d37-662c-4559-9bc2-adb424cae014" (UID: "b10b0d37-662c-4559-9bc2-adb424cae014"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.509836 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.510233 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.510243 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b10b0d37-662c-4559-9bc2-adb424cae014-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.807567 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:43 crc kubenswrapper[4717]: I1007 14:11:43.813985 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-l5rp9"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.249102 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-7b91-account-create-5dp6f"] Oct 07 14:11:44 crc kubenswrapper[4717]: E1007 14:11:44.249784 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b10b0d37-662c-4559-9bc2-adb424cae014" containerName="init" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.249797 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b10b0d37-662c-4559-9bc2-adb424cae014" containerName="init" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.249988 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b10b0d37-662c-4559-9bc2-adb424cae014" containerName="init" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.250575 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.259552 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.272856 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7b91-account-create-5dp6f"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.329956 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49c7\" (UniqueName: \"kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7\") pod \"manila-7b91-account-create-5dp6f\" (UID: \"ce8237ed-de9c-4c53-a716-de316c22554b\") " pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.356074 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e87f-account-create-m8mwt"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.357289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.359200 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.384446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e87f-account-create-m8mwt"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.432592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b49c7\" (UniqueName: \"kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7\") pod \"manila-7b91-account-create-5dp6f\" (UID: \"ce8237ed-de9c-4c53-a716-de316c22554b\") " pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.432754 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sb26\" (UniqueName: \"kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26\") pod \"barbican-e87f-account-create-m8mwt\" (UID: \"0bda80bb-339d-4c40-a8cf-91dc994bcc15\") " pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.439222 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" event={"ID":"1396fa07-0a19-481d-ae2b-f943755ea2ad","Type":"ContainerStarted","Data":"42c716a25b3cc555eca309b639df43746cbd0c5c1d2f7c2dee42b6a1b9a64e60"} Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.439701 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.450326 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerStarted","Data":"6402001a0ffaa057bb6d952afb0f3a18f21b73701900e673f04948e10a7fc36e"} Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.454418 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerStarted","Data":"9dec420108e23a1a14970e93f6d13ee1030777c92ec41ab914b92723a7cbae06"} Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.454471 
4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerStarted","Data":"c80129167453242bad9d500814169c954e888fae1b9501ae4b88adcea2ed6eb8"} Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.464194 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" podStartSLOduration=3.46416976 podStartE2EDuration="3.46416976s" podCreationTimestamp="2025-10-07 14:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:44.455889531 +0000 UTC m=+1086.283815323" watchObservedRunningTime="2025-10-07 14:11:44.46416976 +0000 UTC m=+1086.292095552" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.464761 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49c7\" (UniqueName: \"kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7\") pod \"manila-7b91-account-create-5dp6f\" (UID: \"ce8237ed-de9c-4c53-a716-de316c22554b\") " pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.493500 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.493479492 podStartE2EDuration="3.493479492s" podCreationTimestamp="2025-10-07 14:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:44.485023688 +0000 UTC m=+1086.312949480" watchObservedRunningTime="2025-10-07 14:11:44.493479492 +0000 UTC m=+1086.321405284" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.535240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sb26\" (UniqueName: \"kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26\") pod \"barbican-e87f-account-create-m8mwt\" (UID: \"0bda80bb-339d-4c40-a8cf-91dc994bcc15\") " pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.556790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sb26\" (UniqueName: \"kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26\") pod \"barbican-e87f-account-create-m8mwt\" (UID: \"0bda80bb-339d-4c40-a8cf-91dc994bcc15\") " pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.569535 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.657644 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8cd3-account-create-h8gpd"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.659439 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.663316 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.669029 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8cd3-account-create-h8gpd"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.726375 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.738353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9gzz\" (UniqueName: \"kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz\") pod \"cinder-8cd3-account-create-h8gpd\" (UID: \"9dc43ea6-a0ae-48a5-9c55-4b932864ea43\") " pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.770224 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4f33-account-create-rpvq6"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.771631 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.774509 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.789022 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4f33-account-create-rpvq6"] Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.840469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9gzz\" (UniqueName: \"kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz\") pod \"cinder-8cd3-account-create-h8gpd\" (UID: \"9dc43ea6-a0ae-48a5-9c55-4b932864ea43\") " pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.840560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkzz\" (UniqueName: \"kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz\") pod \"neutron-4f33-account-create-rpvq6\" (UID: \"1189ffe2-55f3-4da5-9cf5-2a871828cfa3\") " pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.857875 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9gzz\" (UniqueName: \"kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz\") pod \"cinder-8cd3-account-create-h8gpd\" (UID: \"9dc43ea6-a0ae-48a5-9c55-4b932864ea43\") " pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.913754 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b10b0d37-662c-4559-9bc2-adb424cae014" path="/var/lib/kubelet/pods/b10b0d37-662c-4559-9bc2-adb424cae014/volumes" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.942125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkzz\" (UniqueName: \"kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz\") pod \"neutron-4f33-account-create-rpvq6\" (UID: 
\"1189ffe2-55f3-4da5-9cf5-2a871828cfa3\") " pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:11:44 crc kubenswrapper[4717]: I1007 14:11:44.962302 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkzz\" (UniqueName: \"kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz\") pod \"neutron-4f33-account-create-rpvq6\" (UID: \"1189ffe2-55f3-4da5-9cf5-2a871828cfa3\") " pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.005226 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.126451 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.149943 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-7b91-account-create-5dp6f"] Oct 07 14:11:45 crc kubenswrapper[4717]: W1007 14:11:45.166846 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce8237ed_de9c_4c53_a716_de316c22554b.slice/crio-790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92 WatchSource:0}: Error finding container 790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92: Status 404 returned error can't find the container with id 790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92 Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.284160 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e87f-account-create-m8mwt"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.357859 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:45 crc kubenswrapper[4717]: W1007 14:11:45.368999 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bda80bb_339d_4c40_a8cf_91dc994bcc15.slice/crio-7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec WatchSource:0}: Error finding container 7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec: Status 404 returned error can't find the container with id 7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.437175 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.451484 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.499984 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.531403 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.551340 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.569438 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.569509 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhzm7\" (UniqueName: \"kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.570142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.570186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.570406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.576631 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8cd3-account-create-h8gpd"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.613504 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e87f-account-create-m8mwt" event={"ID":"0bda80bb-339d-4c40-a8cf-91dc994bcc15","Type":"ContainerStarted","Data":"7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec"} Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.639164 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b91-account-create-5dp6f" event={"ID":"ce8237ed-de9c-4c53-a716-de316c22554b","Type":"ContainerStarted","Data":"790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92"} Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.671157 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.673164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc 
kubenswrapper[4717]: I1007 14:11:45.673215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.673269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.673341 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.673367 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhzm7\" (UniqueName: \"kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.675213 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.677555 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.683065 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.692517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhzm7\" (UniqueName: \"kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.709526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key\") pod \"horizon-766954cfd9-kkf7x\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.713536 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-7b91-account-create-5dp6f" 
podStartSLOduration=1.7135064610000001 podStartE2EDuration="1.713506461s" podCreationTimestamp="2025-10-07 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:45.661566692 +0000 UTC m=+1087.489492494" watchObservedRunningTime="2025-10-07 14:11:45.713506461 +0000 UTC m=+1087.541432253" Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.856237 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4f33-account-create-rpvq6"] Oct 07 14:11:45 crc kubenswrapper[4717]: I1007 14:11:45.865613 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.496226 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.650112 4717 generic.go:334] "Generic (PLEG): container finished" podID="ce8237ed-de9c-4c53-a716-de316c22554b" containerID="a03122b0c53e442dfa9ea67095f432f82a90d23d95cc4352ee506aa1cd33b724" exitCode=0 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.650205 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b91-account-create-5dp6f" event={"ID":"ce8237ed-de9c-4c53-a716-de316c22554b","Type":"ContainerDied","Data":"a03122b0c53e442dfa9ea67095f432f82a90d23d95cc4352ee506aa1cd33b724"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.652037 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerStarted","Data":"bb058dd907c6d136b448a167a7ebba016e2380acd6679d17ef4cda5da83c9683"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.654043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4f33-account-create-rpvq6" event={"ID":"1189ffe2-55f3-4da5-9cf5-2a871828cfa3","Type":"ContainerStarted","Data":"0a8050a8cd4d044ad4d231288b090a4cfc07e20d794bdee8a68820d57518ba75"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.654082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4f33-account-create-rpvq6" event={"ID":"1189ffe2-55f3-4da5-9cf5-2a871828cfa3","Type":"ContainerStarted","Data":"81f32758e205f13d86ea734042faba5c674e4d36fbc23de3cca27001fe50c36a"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.656337 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e87f-account-create-m8mwt" event={"ID":"0bda80bb-339d-4c40-a8cf-91dc994bcc15","Type":"ContainerStarted","Data":"802349aec5f093427e75c76200160d57a5a7989e456b9f437e17268d078c350a"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.658163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8cd3-account-create-h8gpd" event={"ID":"9dc43ea6-a0ae-48a5-9c55-4b932864ea43","Type":"ContainerStarted","Data":"8094c3c8c4396c5d8ee7febbf5a60ef36df9e85e71eefeefacc7b025a4c3860d"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.658189 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8cd3-account-create-h8gpd" event={"ID":"9dc43ea6-a0ae-48a5-9c55-4b932864ea43","Type":"ContainerStarted","Data":"4519235d173db255d1eee939782bb1582086bd803310d25f344df600054a8deb"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.660965 4717 generic.go:334] "Generic (PLEG): container finished" podID="2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" 
containerID="154cac6ff0aa1e601735fe658a0b5326673b4765554107198a72eb49227ad728" exitCode=0 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.660964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtb44" event={"ID":"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655","Type":"ContainerDied","Data":"154cac6ff0aa1e601735fe658a0b5326673b4765554107198a72eb49227ad728"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.670860 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-log" containerID="cri-o://10bdfcd3afba85494b2aac3c5e93d1e60b06abbc51bb3018f176f583a653c185" gracePeriod=30 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.671360 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-log" containerID="cri-o://9dec420108e23a1a14970e93f6d13ee1030777c92ec41ab914b92723a7cbae06" gracePeriod=30 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.671414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerStarted","Data":"37946b4189799efe3d3271cff0ee68ab675c915cc91de5a25072e69606cb6d2f"} Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.671470 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-httpd" containerID="cri-o://6402001a0ffaa057bb6d952afb0f3a18f21b73701900e673f04948e10a7fc36e" gracePeriod=30 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.671541 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-httpd" containerID="cri-o://37946b4189799efe3d3271cff0ee68ab675c915cc91de5a25072e69606cb6d2f" gracePeriod=30 Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.691474 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-e87f-account-create-m8mwt" podStartSLOduration=2.691442293 podStartE2EDuration="2.691442293s" podCreationTimestamp="2025-10-07 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:46.684208762 +0000 UTC m=+1088.512134574" watchObservedRunningTime="2025-10-07 14:11:46.691442293 +0000 UTC m=+1088.519368085" Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.724600 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-4f33-account-create-rpvq6" podStartSLOduration=2.724581631 podStartE2EDuration="2.724581631s" podCreationTimestamp="2025-10-07 14:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:46.722531964 +0000 UTC m=+1088.550457756" watchObservedRunningTime="2025-10-07 14:11:46.724581631 +0000 UTC m=+1088.552507423" Oct 07 14:11:46 crc kubenswrapper[4717]: I1007 14:11:46.772379 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.772358874 podStartE2EDuration="5.772358874s" 
podCreationTimestamp="2025-10-07 14:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:11:46.754268883 +0000 UTC m=+1088.582194675" watchObservedRunningTime="2025-10-07 14:11:46.772358874 +0000 UTC m=+1088.600284666" Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.686941 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdb138bd-7439-4632-b25b-457048de3859" containerID="37946b4189799efe3d3271cff0ee68ab675c915cc91de5a25072e69606cb6d2f" exitCode=0 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.687445 4717 generic.go:334] "Generic (PLEG): container finished" podID="cdb138bd-7439-4632-b25b-457048de3859" containerID="9dec420108e23a1a14970e93f6d13ee1030777c92ec41ab914b92723a7cbae06" exitCode=143 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.687034 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerDied","Data":"37946b4189799efe3d3271cff0ee68ab675c915cc91de5a25072e69606cb6d2f"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.687513 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerDied","Data":"9dec420108e23a1a14970e93f6d13ee1030777c92ec41ab914b92723a7cbae06"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.689257 4717 generic.go:334] "Generic (PLEG): container finished" podID="1189ffe2-55f3-4da5-9cf5-2a871828cfa3" containerID="0a8050a8cd4d044ad4d231288b090a4cfc07e20d794bdee8a68820d57518ba75" exitCode=0 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.689299 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4f33-account-create-rpvq6" event={"ID":"1189ffe2-55f3-4da5-9cf5-2a871828cfa3","Type":"ContainerDied","Data":"0a8050a8cd4d044ad4d231288b090a4cfc07e20d794bdee8a68820d57518ba75"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.693776 4717 generic.go:334] "Generic (PLEG): container finished" podID="0bda80bb-339d-4c40-a8cf-91dc994bcc15" containerID="802349aec5f093427e75c76200160d57a5a7989e456b9f437e17268d078c350a" exitCode=0 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.693889 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e87f-account-create-m8mwt" event={"ID":"0bda80bb-339d-4c40-a8cf-91dc994bcc15","Type":"ContainerDied","Data":"802349aec5f093427e75c76200160d57a5a7989e456b9f437e17268d078c350a"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.697827 4717 generic.go:334] "Generic (PLEG): container finished" podID="9dc43ea6-a0ae-48a5-9c55-4b932864ea43" containerID="8094c3c8c4396c5d8ee7febbf5a60ef36df9e85e71eefeefacc7b025a4c3860d" exitCode=0 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.697895 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8cd3-account-create-h8gpd" event={"ID":"9dc43ea6-a0ae-48a5-9c55-4b932864ea43","Type":"ContainerDied","Data":"8094c3c8c4396c5d8ee7febbf5a60ef36df9e85e71eefeefacc7b025a4c3860d"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.709256 4717 generic.go:334] "Generic (PLEG): container finished" podID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerID="6402001a0ffaa057bb6d952afb0f3a18f21b73701900e673f04948e10a7fc36e" exitCode=0 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.709294 4717 generic.go:334] "Generic (PLEG): container 
finished" podID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerID="10bdfcd3afba85494b2aac3c5e93d1e60b06abbc51bb3018f176f583a653c185" exitCode=143 Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.709464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerDied","Data":"6402001a0ffaa057bb6d952afb0f3a18f21b73701900e673f04948e10a7fc36e"} Oct 07 14:11:47 crc kubenswrapper[4717]: I1007 14:11:47.709490 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerDied","Data":"10bdfcd3afba85494b2aac3c5e93d1e60b06abbc51bb3018f176f583a653c185"} Oct 07 14:11:49 crc kubenswrapper[4717]: I1007 14:11:49.989950 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.074889 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9gzz\" (UniqueName: \"kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz\") pod \"9dc43ea6-a0ae-48a5-9c55-4b932864ea43\" (UID: \"9dc43ea6-a0ae-48a5-9c55-4b932864ea43\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.080300 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz" (OuterVolumeSpecName: "kube-api-access-s9gzz") pod "9dc43ea6-a0ae-48a5-9c55-4b932864ea43" (UID: "9dc43ea6-a0ae-48a5-9c55-4b932864ea43"). InnerVolumeSpecName "kube-api-access-s9gzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.177223 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9gzz\" (UniqueName: \"kubernetes.io/projected/9dc43ea6-a0ae-48a5-9c55-4b932864ea43-kube-api-access-s9gzz\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.406039 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582537 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sp9q\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582611 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582698 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.582953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.583024 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"7a8a05c1-27f7-446e-b49e-1762d51644a5\" (UID: \"7a8a05c1-27f7-446e-b49e-1762d51644a5\") " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.584275 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs" (OuterVolumeSpecName: "logs") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.584281 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.587683 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts" (OuterVolumeSpecName: "scripts") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.587724 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph" (OuterVolumeSpecName: "ceph") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.587857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q" (OuterVolumeSpecName: "kube-api-access-7sp9q") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "kube-api-access-7sp9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.588185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.607659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.633221 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data" (OuterVolumeSpecName: "config-data") pod "7a8a05c1-27f7-446e-b49e-1762d51644a5" (UID: "7a8a05c1-27f7-446e-b49e-1762d51644a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685482 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685519 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a8a05c1-27f7-446e-b49e-1762d51644a5-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685550 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685559 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sp9q\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-kube-api-access-7sp9q\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685570 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7a8a05c1-27f7-446e-b49e-1762d51644a5-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685579 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685588 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.685596 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a8a05c1-27f7-446e-b49e-1762d51644a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.702511 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.734388 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8cd3-account-create-h8gpd" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.734398 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8cd3-account-create-h8gpd" event={"ID":"9dc43ea6-a0ae-48a5-9c55-4b932864ea43","Type":"ContainerDied","Data":"4519235d173db255d1eee939782bb1582086bd803310d25f344df600054a8deb"} Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.734431 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4519235d173db255d1eee939782bb1582086bd803310d25f344df600054a8deb" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.736645 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.736629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7a8a05c1-27f7-446e-b49e-1762d51644a5","Type":"ContainerDied","Data":"e0b8da7378bfb191e91dc012c1a3d72e02d25a5f9c46da932b23297f30abd98a"} Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.736723 4717 scope.go:117] "RemoveContainer" containerID="6402001a0ffaa057bb6d952afb0f3a18f21b73701900e673f04948e10a7fc36e" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.777420 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.784238 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.786873 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.804712 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:50 crc kubenswrapper[4717]: E1007 14:11:50.805159 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-log" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805175 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-log" Oct 07 14:11:50 crc kubenswrapper[4717]: E1007 14:11:50.805187 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-httpd" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805194 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-httpd" Oct 07 14:11:50 crc kubenswrapper[4717]: E1007 14:11:50.805202 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc43ea6-a0ae-48a5-9c55-4b932864ea43" containerName="mariadb-account-create" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805215 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc43ea6-a0ae-48a5-9c55-4b932864ea43" containerName="mariadb-account-create" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805382 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc43ea6-a0ae-48a5-9c55-4b932864ea43" containerName="mariadb-account-create" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805397 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-httpd" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.805412 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" containerName="glance-log" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.806309 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.810609 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.813140 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892190 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h6cf\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892271 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892444 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892573 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892636 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.892687 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.907843 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a8a05c1-27f7-446e-b49e-1762d51644a5" path="/var/lib/kubelet/pods/7a8a05c1-27f7-446e-b49e-1762d51644a5/volumes" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.994422 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h6cf\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.994738 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.994762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995391 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995440 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995476 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.995942 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 14:11:50 crc kubenswrapper[4717]: I1007 14:11:50.996332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.013041 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.013263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.013694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.013787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.017146 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h6cf\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.037070 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.173947 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.538256 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.635761 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:11:51 crc kubenswrapper[4717]: I1007 14:11:51.636069 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="dnsmasq-dns" containerID="cri-o://d7ac35ae2c449f8d7113e2effe4db52ef5f451dd3b526099b9b6856e055ad678" gracePeriod=10 Oct 07 14:11:52 crc kubenswrapper[4717]: I1007 14:11:52.771790 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerID="d7ac35ae2c449f8d7113e2effe4db52ef5f451dd3b526099b9b6856e055ad678" exitCode=0 Oct 07 14:11:52 crc kubenswrapper[4717]: I1007 14:11:52.771869 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" event={"ID":"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d","Type":"ContainerDied","Data":"d7ac35ae2c449f8d7113e2effe4db52ef5f451dd3b526099b9b6856e055ad678"} Oct 07 14:11:53 crc kubenswrapper[4717]: I1007 14:11:53.754096 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.103548 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.135350 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.136805 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.140380 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.147641 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.227684 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.268926 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269030 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269057 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269111 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srjrx\" (UniqueName: \"kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.269201 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.270470 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-ff6468b6d-j9vqb"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.271913 
4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.299045 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ff6468b6d-j9vqb"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.370985 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-scripts\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371050 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabca24d-42a9-45e6-81ca-ad04bf8bd588-logs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371167 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-secret-key\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371185 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-combined-ca-bundle\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371203 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srjrx\" (UniqueName: \"kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371277 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-tls-certs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371319 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-config-data\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.371427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jm76\" (UniqueName: \"kubernetes.io/projected/fabca24d-42a9-45e6-81ca-ad04bf8bd588-kube-api-access-4jm76\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.372177 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.372340 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.373640 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.391675 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: 
\"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.392571 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.397788 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.402539 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srjrx\" (UniqueName: \"kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx\") pod \"horizon-5b6cc95fd6-f8jf5\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.472756 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-tls-certs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.472845 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-config-data\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.472915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jm76\" (UniqueName: \"kubernetes.io/projected/fabca24d-42a9-45e6-81ca-ad04bf8bd588-kube-api-access-4jm76\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.472955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-scripts\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.472976 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabca24d-42a9-45e6-81ca-ad04bf8bd588-logs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.473049 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-secret-key\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.473071 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-combined-ca-bundle\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.473094 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.477222 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-config-data\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.477688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fabca24d-42a9-45e6-81ca-ad04bf8bd588-logs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.477932 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fabca24d-42a9-45e6-81ca-ad04bf8bd588-scripts\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.488212 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-combined-ca-bundle\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.491593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-tls-certs\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.510576 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jm76\" (UniqueName: \"kubernetes.io/projected/fabca24d-42a9-45e6-81ca-ad04bf8bd588-kube-api-access-4jm76\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.512676 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fabca24d-42a9-45e6-81ca-ad04bf8bd588-horizon-secret-key\") pod \"horizon-ff6468b6d-j9vqb\" (UID: \"fabca24d-42a9-45e6-81ca-ad04bf8bd588\") " pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.598755 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.820685 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tjhjz"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.822048 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.825025 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.825157 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.825201 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zkfpb" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.831481 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tjhjz"] Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982440 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982528 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982725 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgc8\" (UniqueName: \"kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982832 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:54 crc kubenswrapper[4717]: I1007 14:11:54.982850 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.084999 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085148 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tlgc8\" (UniqueName: \"kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.085398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.090206 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.090623 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.095674 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.096188 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " 
pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.104935 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgc8\" (UniqueName: \"kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8\") pod \"cinder-db-sync-tjhjz\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:55 crc kubenswrapper[4717]: I1007 14:11:55.146447 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:11:56 crc kubenswrapper[4717]: I1007 14:11:56.596975 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: connect: connection refused" Oct 07 14:11:57 crc kubenswrapper[4717]: E1007 14:11:57.967689 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 07 14:11:57 crc kubenswrapper[4717]: E1007 14:11:57.968129 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5dfh668h5cbh57dh5c4h676h578hdh54h547hbchfch665h648hf9h66hbdh58fhdch697h8chdch5d8h56bh688h695h594h9h5bdhf8h56ch5bcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwppp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7b867bb977-pb7h8_openstack(362407be-b43c-4a74-8b08-d22518d2b6b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:11:57 crc kubenswrapper[4717]: E1007 14:11:57.977483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" 
for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7b867bb977-pb7h8" podUID="362407be-b43c-4a74-8b08-d22518d2b6b4" Oct 07 14:11:58 crc kubenswrapper[4717]: E1007 14:11:58.002231 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Oct 07 14:11:58 crc kubenswrapper[4717]: E1007 14:11:58.002370 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67chc4h5c6hbh67dhcdh64h579hcdh66bh54h546h687h557h558hd4h57h8ch8dh694h5b9h575hb7hcch68dh7ch5dh56h68chd7hddh56bq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jbdw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-747d686857-svzsx_openstack(6585c6e8-1873-4e2a-ba9c-9590fc162dcb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:11:58 crc kubenswrapper[4717]: E1007 14:11:58.004372 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-747d686857-svzsx" podUID="6585c6e8-1873-4e2a-ba9c-9590fc162dcb" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.104414 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250546 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250656 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250675 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250733 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d87vv\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250843 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250865 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.250912 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs\") pod \"cdb138bd-7439-4632-b25b-457048de3859\" (UID: \"cdb138bd-7439-4632-b25b-457048de3859\") " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.251701 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs" (OuterVolumeSpecName: "logs") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.253073 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.278204 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv" (OuterVolumeSpecName: "kube-api-access-d87vv") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "kube-api-access-d87vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.281160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.287255 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts" (OuterVolumeSpecName: "scripts") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.289862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph" (OuterVolumeSpecName: "ceph") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.291179 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.311070 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data" (OuterVolumeSpecName: "config-data") pod "cdb138bd-7439-4632-b25b-457048de3859" (UID: "cdb138bd-7439-4632-b25b-457048de3859"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353455 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353494 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353507 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353518 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d87vv\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-kube-api-access-d87vv\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353532 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdb138bd-7439-4632-b25b-457048de3859-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353572 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353598 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdb138bd-7439-4632-b25b-457048de3859-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.353610 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb138bd-7439-4632-b25b-457048de3859-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.378223 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.455235 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.852335 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cdb138bd-7439-4632-b25b-457048de3859","Type":"ContainerDied","Data":"c80129167453242bad9d500814169c954e888fae1b9501ae4b88adcea2ed6eb8"} Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.852515 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.946064 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.984355 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:58 crc kubenswrapper[4717]: I1007 14:11:58.998800 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:59 crc kubenswrapper[4717]: E1007 14:11:58.999719 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-httpd" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:58.999738 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-httpd" Oct 07 14:11:59 crc kubenswrapper[4717]: E1007 14:11:58.999770 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-log" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:58.999776 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-log" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.000099 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-log" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.000124 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb138bd-7439-4632-b25b-457048de3859" containerName="glance-httpd" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.002496 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.007111 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.007607 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.055146 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073472 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073491 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073536 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073567 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073592 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073682 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m2zzh\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.073701 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2zzh\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176164 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176193 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176335 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176360 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.176758 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.177923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.178338 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.182864 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.182888 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.183817 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.186690 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.202623 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.203718 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2zzh\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.233918 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: I1007 14:11:59.343355 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:11:59 crc kubenswrapper[4717]: E1007 14:11:59.938109 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 07 14:11:59 crc kubenswrapper[4717]: E1007 14:11:59.938710 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfjjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-qmft2_openstack(eb1a5ed9-e123-447d-a56d-e0cce35eb56a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:11:59 crc kubenswrapper[4717]: E1007 14:11:59.940608 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-qmft2" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.084515 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.099911 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.110045 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-747d686857-svzsx" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.124627 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.168692 4717 scope.go:117] "RemoveContainer" containerID="10bdfcd3afba85494b2aac3c5e93d1e60b06abbc51bb3018f176f583a653c185" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.171430 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.171960 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.200850 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwppp\" (UniqueName: \"kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp\") pod \"362407be-b43c-4a74-8b08-d22518d2b6b4\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201063 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdw7\" (UniqueName: \"kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7\") pod \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key\") pod \"362407be-b43c-4a74-8b08-d22518d2b6b4\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201436 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts\") pod \"362407be-b43c-4a74-8b08-d22518d2b6b4\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201465 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts\") pod \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201504 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key\") pod \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs\") pod \"362407be-b43c-4a74-8b08-d22518d2b6b4\" (UID: \"362407be-b43c-4a74-8b08-d22518d2b6b4\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201553 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201616 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data\") pod \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201695 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data\") pod \"362407be-b43c-4a74-8b08-d22518d2b6b4\" (UID: 
\"362407be-b43c-4a74-8b08-d22518d2b6b4\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201725 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201787 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201874 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96c6d\" (UniqueName: \"kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201898 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201957 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b49c7\" (UniqueName: \"kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7\") pod \"ce8237ed-de9c-4c53-a716-de316c22554b\" (UID: \"ce8237ed-de9c-4c53-a716-de316c22554b\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.201979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle\") pod \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\" (UID: \"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.202036 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs\") pod \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\" (UID: \"6585c6e8-1873-4e2a-ba9c-9590fc162dcb\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.202853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs" (OuterVolumeSpecName: "logs") pod "6585c6e8-1873-4e2a-ba9c-9590fc162dcb" (UID: "6585c6e8-1873-4e2a-ba9c-9590fc162dcb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.203178 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts" (OuterVolumeSpecName: "scripts") pod "362407be-b43c-4a74-8b08-d22518d2b6b4" (UID: "362407be-b43c-4a74-8b08-d22518d2b6b4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.203377 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data" (OuterVolumeSpecName: "config-data") pod "6585c6e8-1873-4e2a-ba9c-9590fc162dcb" (UID: "6585c6e8-1873-4e2a-ba9c-9590fc162dcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.204078 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data" (OuterVolumeSpecName: "config-data") pod "362407be-b43c-4a74-8b08-d22518d2b6b4" (UID: "362407be-b43c-4a74-8b08-d22518d2b6b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.210706 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.211053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs" (OuterVolumeSpecName: "logs") pod "362407be-b43c-4a74-8b08-d22518d2b6b4" (UID: "362407be-b43c-4a74-8b08-d22518d2b6b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.214348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts" (OuterVolumeSpecName: "scripts") pod "6585c6e8-1873-4e2a-ba9c-9590fc162dcb" (UID: "6585c6e8-1873-4e2a-ba9c-9590fc162dcb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.216569 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7" (OuterVolumeSpecName: "kube-api-access-jbdw7") pod "6585c6e8-1873-4e2a-ba9c-9590fc162dcb" (UID: "6585c6e8-1873-4e2a-ba9c-9590fc162dcb"). InnerVolumeSpecName "kube-api-access-jbdw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.217144 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6585c6e8-1873-4e2a-ba9c-9590fc162dcb" (UID: "6585c6e8-1873-4e2a-ba9c-9590fc162dcb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.217668 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.217861 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts" (OuterVolumeSpecName: "scripts") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.217932 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d" (OuterVolumeSpecName: "kube-api-access-96c6d") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "kube-api-access-96c6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.217866 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7" (OuterVolumeSpecName: "kube-api-access-b49c7") pod "ce8237ed-de9c-4c53-a716-de316c22554b" (UID: "ce8237ed-de9c-4c53-a716-de316c22554b"). InnerVolumeSpecName "kube-api-access-b49c7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.220516 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp" (OuterVolumeSpecName: "kube-api-access-nwppp") pod "362407be-b43c-4a74-8b08-d22518d2b6b4" (UID: "362407be-b43c-4a74-8b08-d22518d2b6b4"). InnerVolumeSpecName "kube-api-access-nwppp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.221075 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "362407be-b43c-4a74-8b08-d22518d2b6b4" (UID: "362407be-b43c-4a74-8b08-d22518d2b6b4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.238077 4717 scope.go:117] "RemoveContainer" containerID="37946b4189799efe3d3271cff0ee68ab675c915cc91de5a25072e69606cb6d2f" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.242100 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data" (OuterVolumeSpecName: "config-data") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.260219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" (UID: "2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.307999 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sb26\" (UniqueName: \"kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26\") pod \"0bda80bb-339d-4c40-a8cf-91dc994bcc15\" (UID: \"0bda80bb-339d-4c40-a8cf-91dc994bcc15\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.308720 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdkzz\" (UniqueName: \"kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz\") pod \"1189ffe2-55f3-4da5-9cf5-2a871828cfa3\" (UID: \"1189ffe2-55f3-4da5-9cf5-2a871828cfa3\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309234 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96c6d\" (UniqueName: \"kubernetes.io/projected/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-kube-api-access-96c6d\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309253 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309266 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b49c7\" (UniqueName: \"kubernetes.io/projected/ce8237ed-de9c-4c53-a716-de316c22554b-kube-api-access-b49c7\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309275 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309284 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309293 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwppp\" (UniqueName: \"kubernetes.io/projected/362407be-b43c-4a74-8b08-d22518d2b6b4-kube-api-access-nwppp\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309302 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdw7\" (UniqueName: \"kubernetes.io/projected/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-kube-api-access-jbdw7\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309312 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/362407be-b43c-4a74-8b08-d22518d2b6b4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309320 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309331 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309339 4717 
reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309349 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/362407be-b43c-4a74-8b08-d22518d2b6b4-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309356 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309364 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6585c6e8-1873-4e2a-ba9c-9590fc162dcb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309372 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/362407be-b43c-4a74-8b08-d22518d2b6b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309380 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.309388 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.316421 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz" (OuterVolumeSpecName: "kube-api-access-sdkzz") pod "1189ffe2-55f3-4da5-9cf5-2a871828cfa3" (UID: "1189ffe2-55f3-4da5-9cf5-2a871828cfa3"). InnerVolumeSpecName "kube-api-access-sdkzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.320702 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26" (OuterVolumeSpecName: "kube-api-access-8sb26") pod "0bda80bb-339d-4c40-a8cf-91dc994bcc15" (UID: "0bda80bb-339d-4c40-a8cf-91dc994bcc15"). InnerVolumeSpecName "kube-api-access-8sb26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.410400 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdkzz\" (UniqueName: \"kubernetes.io/projected/1189ffe2-55f3-4da5-9cf5-2a871828cfa3-kube-api-access-sdkzz\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.410428 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sb26\" (UniqueName: \"kubernetes.io/projected/0bda80bb-339d-4c40-a8cf-91dc994bcc15-kube-api-access-8sb26\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.482283 4717 scope.go:117] "RemoveContainer" containerID="9dec420108e23a1a14970e93f6d13ee1030777c92ec41ab914b92723a7cbae06" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.615391 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.701937 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.715810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p88sw\" (UniqueName: \"kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.715909 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.715966 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.716083 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.716137 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.716194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config\") pod \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\" (UID: \"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d\") " Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.723168 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw" (OuterVolumeSpecName: "kube-api-access-p88sw") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "kube-api-access-p88sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.776392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.783403 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.793124 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.801160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config" (OuterVolumeSpecName: "config") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.816215 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tjhjz"] Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.818200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" (UID: "c4c9e5fe-375d-49f4-b8a4-0fa91a12770d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821381 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p88sw\" (UniqueName: \"kubernetes.io/projected/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-kube-api-access-p88sw\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821420 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821433 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821443 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821453 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.821843 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.828514 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-ff6468b6d-j9vqb"] Oct 07 14:12:00 crc kubenswrapper[4717]: W1007 14:12:00.851607 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfabca24d_42a9_45e6_81ca_ad04bf8bd588.slice/crio-5d0c51af53297b019bd4362efe114d9d8d5158354d008701c76c2f7fed494b3f WatchSource:0}: Error finding container 5d0c51af53297b019bd4362efe114d9d8d5158354d008701c76c2f7fed494b3f: Status 404 returned error can't find the container with id 5d0c51af53297b019bd4362efe114d9d8d5158354d008701c76c2f7fed494b3f Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.876082 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e87f-account-create-m8mwt" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.883859 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb138bd-7439-4632-b25b-457048de3859" path="/var/lib/kubelet/pods/cdb138bd-7439-4632-b25b-457048de3859/volumes" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.885676 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e87f-account-create-m8mwt" event={"ID":"0bda80bb-339d-4c40-a8cf-91dc994bcc15","Type":"ContainerDied","Data":"7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.885704 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f25908bf46f5502a713bc44197c5f4fd36dea8b55f64b8201b82099d03003ec" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.889571 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qtb44" event={"ID":"2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655","Type":"ContainerDied","Data":"045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.889626 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="045680ecd573881c4c5177330c03ea89e50a83dda0ac0671addf86b8d6e5843f" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.889715 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qtb44" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.901217 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b867bb977-pb7h8" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.901220 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b867bb977-pb7h8" event={"ID":"362407be-b43c-4a74-8b08-d22518d2b6b4","Type":"ContainerDied","Data":"0357e296d3c91b2f3a8d7ff569c15436f61eb34828820e8496a45f1672420c16"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.902734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff6468b6d-j9vqb" event={"ID":"fabca24d-42a9-45e6-81ca-ad04bf8bd588","Type":"ContainerStarted","Data":"5d0c51af53297b019bd4362efe114d9d8d5158354d008701c76c2f7fed494b3f"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.904813 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tjhjz" event={"ID":"8dd33a1d-9592-465a-8285-941a03e92fa4","Type":"ContainerStarted","Data":"227379d46b79e1fba8ee453c98e256bd808c0f197555bcbd1cd5af0f216ecc57"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.905936 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.906144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerStarted","Data":"020423bd11264a6b25390d927315cc8292362dab4ac1e3794faf30a8bed66485"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.909645 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4f33-account-create-rpvq6" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.909637 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4f33-account-create-rpvq6" event={"ID":"1189ffe2-55f3-4da5-9cf5-2a871828cfa3","Type":"ContainerDied","Data":"81f32758e205f13d86ea734042faba5c674e4d36fbc23de3cca27001fe50c36a"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.909780 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f32758e205f13d86ea734042faba5c674e4d36fbc23de3cca27001fe50c36a" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.912617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerStarted","Data":"1cb06ec2cd147927144c9470c2c41f876da6c35157a727a648fccc48bd8fa4c1"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.913946 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-7b91-account-create-5dp6f" event={"ID":"ce8237ed-de9c-4c53-a716-de316c22554b","Type":"ContainerDied","Data":"790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.913974 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="790fc973a46fbf6f45804552cdc26f6b5196fda8a6793b2953284f003f03ec92" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.914070 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-7b91-account-create-5dp6f" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.921290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" event={"ID":"c4c9e5fe-375d-49f4-b8a4-0fa91a12770d","Type":"ContainerDied","Data":"d959487432f9bb27704b743d38a00a6112ae5080a5ae98e9dd3c3e7b52467cc9"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.921351 4717 scope.go:117] "RemoveContainer" containerID="d7ac35ae2c449f8d7113e2effe4db52ef5f451dd3b526099b9b6856e055ad678" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.921486 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-jm8l7" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.923934 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-747d686857-svzsx" event={"ID":"6585c6e8-1873-4e2a-ba9c-9590fc162dcb","Type":"ContainerDied","Data":"32524a10b6f0bed64626265f7a50864d07c4ab95f222628f38cb7cd7e03dd9dd"} Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.923982 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-747d686857-svzsx" Oct 07 14:12:00 crc kubenswrapper[4717]: I1007 14:12:00.926280 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerStarted","Data":"6e5268975f6e2494483aafa7302966facb46750f60a301e3bc96fc9184b452d3"} Oct 07 14:12:00 crc kubenswrapper[4717]: E1007 14:12:00.929642 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-qmft2" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.055563 4717 scope.go:117] "RemoveContainer" containerID="2cc165a67bfd20132b560f5504b6a1b738d4219d9839e35f4a10266814d616fd" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.089682 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.102434 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-jm8l7"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.108052 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.135855 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.144396 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b867bb977-pb7h8"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.208217 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.240575 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-747d686857-svzsx"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.276849 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qtb44"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.298889 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qtb44"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.338434 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-z2524"] Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.338961 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bda80bb-339d-4c40-a8cf-91dc994bcc15" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.338979 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bda80bb-339d-4c40-a8cf-91dc994bcc15" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.338990 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce8237ed-de9c-4c53-a716-de316c22554b" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.338996 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce8237ed-de9c-4c53-a716-de316c22554b" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.339034 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" 
containerName="keystone-bootstrap" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339043 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" containerName="keystone-bootstrap" Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.339055 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="init" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339061 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="init" Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.339080 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="dnsmasq-dns" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339086 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="dnsmasq-dns" Oct 07 14:12:01 crc kubenswrapper[4717]: E1007 14:12:01.339106 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1189ffe2-55f3-4da5-9cf5-2a871828cfa3" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339113 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1189ffe2-55f3-4da5-9cf5-2a871828cfa3" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339303 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" containerName="dnsmasq-dns" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339312 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce8237ed-de9c-4c53-a716-de316c22554b" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339325 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" containerName="keystone-bootstrap" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339337 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bda80bb-339d-4c40-a8cf-91dc994bcc15" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.339354 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1189ffe2-55f3-4da5-9cf5-2a871828cfa3" containerName="mariadb-account-create" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.340073 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.343251 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.343584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.343840 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.343845 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m4h7z" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.350343 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z2524"] Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436346 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5wqw\" (UniqueName: \"kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436460 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436582 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.436692 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538235 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys\") pod \"keystone-bootstrap-z2524\" (UID: 
\"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538286 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5wqw\" (UniqueName: \"kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538309 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538328 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.538377 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.545105 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.545440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.548868 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.549127 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.550736 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.557523 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5wqw\" (UniqueName: \"kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw\") pod \"keystone-bootstrap-z2524\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.610138 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.610194 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.754367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.975097 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff6468b6d-j9vqb" event={"ID":"fabca24d-42a9-45e6-81ca-ad04bf8bd588","Type":"ContainerStarted","Data":"93ba5f2942faee8f63fc00cbce7046dcd57412ef1c3f2a1b7643b5a59140ab43"} Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.975921 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-ff6468b6d-j9vqb" event={"ID":"fabca24d-42a9-45e6-81ca-ad04bf8bd588","Type":"ContainerStarted","Data":"68b28b5a087a31c767b387df5cc8f1fbd39061ea9e137aeda3e72afe8cbdef82"} Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.988945 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerStarted","Data":"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1"} Oct 07 14:12:01 crc kubenswrapper[4717]: I1007 14:12:01.989138 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerStarted","Data":"33e3d1605d165cf872c4a9c2afb6ef5800f5887e7cffc483ec6fe21360b1017f"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.015563 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-ff6468b6d-j9vqb" podStartSLOduration=8.015538958 podStartE2EDuration="8.015538958s" podCreationTimestamp="2025-10-07 14:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:02.001501631 +0000 UTC m=+1103.829427423" watchObservedRunningTime="2025-10-07 14:12:02.015538958 +0000 UTC m=+1103.843464750" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.027933 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerStarted","Data":"5dea6e2e805cbbff195d88cd91cf7a6015c6ecfb7f01b84d946bb0bda96b0050"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.028088 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-766954cfd9-kkf7x" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon-log" containerID="cri-o://020423bd11264a6b25390d927315cc8292362dab4ac1e3794faf30a8bed66485" gracePeriod=30 Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.028184 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-766954cfd9-kkf7x" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon" containerID="cri-o://5dea6e2e805cbbff195d88cd91cf7a6015c6ecfb7f01b84d946bb0bda96b0050" gracePeriod=30 Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.034942 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerStarted","Data":"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.035036 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerStarted","Data":"2c2e31ac0f9caf17f45837b25cc3986eb668730e54fcf1f122100840c6586a20"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.040557 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerStarted","Data":"1be9c8727e96c507158a78f88891832a24bc2136e825aa65da4b1531801f787f"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.040624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerStarted","Data":"de5625b664832e995fc93a9a6085e4a0819c974d6527e6d58b235012ed3d3157"} Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.075599 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-766954cfd9-kkf7x" podStartSLOduration=3.410091006 podStartE2EDuration="17.075562966s" podCreationTimestamp="2025-10-07 14:11:45 +0000 UTC" firstStartedPulling="2025-10-07 14:11:46.52746816 +0000 UTC m=+1088.355393952" lastFinishedPulling="2025-10-07 14:12:00.19294012 +0000 UTC m=+1102.020865912" observedRunningTime="2025-10-07 14:12:02.062059423 +0000 UTC m=+1103.889985235" watchObservedRunningTime="2025-10-07 14:12:02.075562966 +0000 UTC m=+1103.903488758" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.093852 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b6cc95fd6-f8jf5" podStartSLOduration=8.093818521 podStartE2EDuration="8.093818521s" podCreationTimestamp="2025-10-07 14:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:02.081866221 +0000 UTC m=+1103.909792013" watchObservedRunningTime="2025-10-07 14:12:02.093818521 +0000 UTC m=+1103.921744323" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.264317 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-z2524"] Oct 07 14:12:02 crc kubenswrapper[4717]: W1007 
14:12:02.846203 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde7c8723_1ef0_4d29_b4ba_97b8dc35e7e3.slice/crio-b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089 WatchSource:0}: Error finding container b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089: Status 404 returned error can't find the container with id b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089 Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.883049 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655" path="/var/lib/kubelet/pods/2b4ae6a5-dd6b-46d5-8cca-0f6a7655f655/volumes" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.887589 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362407be-b43c-4a74-8b08-d22518d2b6b4" path="/var/lib/kubelet/pods/362407be-b43c-4a74-8b08-d22518d2b6b4/volumes" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.889518 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6585c6e8-1873-4e2a-ba9c-9590fc162dcb" path="/var/lib/kubelet/pods/6585c6e8-1873-4e2a-ba9c-9590fc162dcb/volumes" Oct 07 14:12:02 crc kubenswrapper[4717]: I1007 14:12:02.890323 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4c9e5fe-375d-49f4-b8a4-0fa91a12770d" path="/var/lib/kubelet/pods/c4c9e5fe-375d-49f4-b8a4-0fa91a12770d/volumes" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.066442 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerStarted","Data":"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08"} Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.076873 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z2524" event={"ID":"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3","Type":"ContainerStarted","Data":"b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089"} Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.099730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerStarted","Data":"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032"} Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.099723 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-log" containerID="cri-o://9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" gracePeriod=30 Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.100140 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-httpd" containerID="cri-o://d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" gracePeriod=30 Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.101285 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.101269849 podStartE2EDuration="5.101269849s" podCreationTimestamp="2025-10-07 14:11:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-07 14:12:03.100395285 +0000 UTC m=+1104.928321087" watchObservedRunningTime="2025-10-07 14:12:03.101269849 +0000 UTC m=+1104.929195641" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.145682 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.145657825 podStartE2EDuration="13.145657825s" podCreationTimestamp="2025-10-07 14:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:03.127341069 +0000 UTC m=+1104.955266871" watchObservedRunningTime="2025-10-07 14:12:03.145657825 +0000 UTC m=+1104.973583617" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.805297 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.895682 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.895814 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.895924 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.895982 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896099 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896127 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h6cf\" (UniqueName: 
\"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf\") pod \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\" (UID: \"214c763c-d42a-4ac5-bd62-6eab1fa6379f\") " Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896465 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.896896 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs" (OuterVolumeSpecName: "logs") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.897097 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.897114 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/214c763c-d42a-4ac5-bd62-6eab1fa6379f-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.901655 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.901730 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts" (OuterVolumeSpecName: "scripts") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.901861 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph" (OuterVolumeSpecName: "ceph") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.902389 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf" (OuterVolumeSpecName: "kube-api-access-2h6cf") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "kube-api-access-2h6cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.930535 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.946893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data" (OuterVolumeSpecName: "config-data") pod "214c763c-d42a-4ac5-bd62-6eab1fa6379f" (UID: "214c763c-d42a-4ac5-bd62-6eab1fa6379f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998445 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998483 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998498 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h6cf\" (UniqueName: \"kubernetes.io/projected/214c763c-d42a-4ac5-bd62-6eab1fa6379f-kube-api-access-2h6cf\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998513 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998525 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/214c763c-d42a-4ac5-bd62-6eab1fa6379f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:03 crc kubenswrapper[4717]: I1007 14:12:03.998556 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.021801 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.099544 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.111977 4717 generic.go:334] "Generic (PLEG): container finished" podID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerID="d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" exitCode=0 Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112025 4717 generic.go:334] "Generic (PLEG): container finished" podID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerID="9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" exitCode=143 Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112042 4717 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerDied","Data":"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032"} Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112104 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerDied","Data":"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7"} Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"214c763c-d42a-4ac5-bd62-6eab1fa6379f","Type":"ContainerDied","Data":"2c2e31ac0f9caf17f45837b25cc3986eb668730e54fcf1f122100840c6586a20"} Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112084 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.112177 4717 scope.go:117] "RemoveContainer" containerID="d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.116789 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z2524" event={"ID":"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3","Type":"ContainerStarted","Data":"b6faa65930282b1c869049b210335050b683b6225fb3aba45b13724f327ae3cb"} Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.136667 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-z2524" podStartSLOduration=3.136648569 podStartE2EDuration="3.136648569s" podCreationTimestamp="2025-10-07 14:12:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:04.13307189 +0000 UTC m=+1105.960997702" watchObservedRunningTime="2025-10-07 14:12:04.136648569 +0000 UTC m=+1105.964574361" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.138890 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerStarted","Data":"db185d63ce16d27abd278e7cefeb63a1e6642eaac935f05068f16f941de4bb0d"} Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.156996 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.164824 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.172564 4717 scope.go:117] "RemoveContainer" containerID="9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.174032 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:04 crc kubenswrapper[4717]: E1007 14:12:04.174400 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-log" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.174413 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-log" Oct 07 14:12:04 crc kubenswrapper[4717]: E1007 14:12:04.174442 4717 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-httpd" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.174448 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-httpd" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.174594 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-log" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.174635 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" containerName="glance-httpd" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.177043 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.184370 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.184662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.194394 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.202804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.202848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.202938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpwxj\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203589 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203808 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203838 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203868 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.203919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.271780 4717 scope.go:117] "RemoveContainer" containerID="d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" Oct 07 14:12:04 crc kubenswrapper[4717]: E1007 14:12:04.272278 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032\": container with ID starting with d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032 not found: ID does not exist" containerID="d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.272318 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032"} err="failed to get container status \"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032\": rpc error: code = NotFound desc = could not find container \"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032\": container with ID starting with d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032 not found: ID does not exist" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.272344 4717 scope.go:117] "RemoveContainer" containerID="9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" Oct 07 14:12:04 crc kubenswrapper[4717]: E1007 14:12:04.272694 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7\": container with ID starting with 9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7 not found: ID does not exist" containerID="9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.272728 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7"} err="failed to get container status \"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7\": rpc error: code = NotFound desc = could not find container \"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7\": container with ID starting with 9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7 not found: ID does not exist" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.272748 4717 scope.go:117] "RemoveContainer" containerID="d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.273224 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032"} err="failed to get container status \"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032\": rpc error: code = NotFound desc = could not find container \"d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032\": container with ID starting with d39bc8371b2b531d14cb002a584d340cb0e2b6908f0de99ff218ce0d2f2f9032 not found: ID does not exist" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.273255 4717 scope.go:117] "RemoveContainer" containerID="9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.273566 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7"} err="failed to get container status \"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7\": rpc error: code = NotFound desc = could not find container \"9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7\": container with ID starting with 9f2a1875d451babaed686ce696b9864e295f70dc9a23e08a14c550e374864ff7 not found: ID does not exist" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306559 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306682 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306706 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc 
kubenswrapper[4717]: I1007 14:12:04.306739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306784 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpwxj\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.306865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.309440 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.311418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.311539 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.318795 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.330077 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.330346 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.335645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.335954 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpwxj\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.336080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.360648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.474217 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.474268 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.562194 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.601203 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.602848 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.711071 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-fdlxw"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.712139 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.718308 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.718339 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-trdg9" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.730431 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fdlxw"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.817451 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.817491 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.817516 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.817612 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4mp\" (UniqueName: \"kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.863479 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gch6l"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.865530 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.890516 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-px6bj" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.890666 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.907518 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="214c763c-d42a-4ac5-bd62-6eab1fa6379f" path="/var/lib/kubelet/pods/214c763c-d42a-4ac5-bd62-6eab1fa6379f/volumes" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.908520 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gch6l"] Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919045 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919100 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4mp\" (UniqueName: \"kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919156 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919320 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919390 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.919415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc 
kubenswrapper[4717]: I1007 14:12:04.926265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.929835 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.934308 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:04 crc kubenswrapper[4717]: I1007 14:12:04.947840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4mp\" (UniqueName: \"kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp\") pod \"manila-db-sync-fdlxw\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.021721 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.021803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.021932 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.027456 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.029714 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.039554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295kt\" (UniqueName: 
\"kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt\") pod \"barbican-db-sync-gch6l\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.045101 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fdlxw" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.214937 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gch6l" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.221627 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8w5dj"] Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.223367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.231562 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pxttm" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.233584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.233854 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.266188 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8w5dj"] Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.272150 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.325978 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.326116 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clb48\" (UniqueName: \"kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.326157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.430207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clb48\" (UniqueName: \"kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.430287 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle\") pod 
\"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.430376 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.437108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.448446 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.448576 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clb48\" (UniqueName: \"kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48\") pod \"neutron-db-sync-8w5dj\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.726360 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.750793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-fdlxw"] Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.769439 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gch6l"] Oct 07 14:12:05 crc kubenswrapper[4717]: W1007 14:12:05.807954 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod339f6768_36c5_4856_9159_29573ce25fe8.slice/crio-b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8 WatchSource:0}: Error finding container b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8: Status 404 returned error can't find the container with id b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8 Oct 07 14:12:05 crc kubenswrapper[4717]: I1007 14:12:05.866145 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:12:06 crc kubenswrapper[4717]: I1007 14:12:06.178255 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gch6l" event={"ID":"339f6768-36c5-4856-9159-29573ce25fe8","Type":"ContainerStarted","Data":"b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8"} Oct 07 14:12:06 crc kubenswrapper[4717]: I1007 14:12:06.188553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fdlxw" event={"ID":"1703fd90-e328-4f67-850f-38f8663dd2c2","Type":"ContainerStarted","Data":"d687060f694cb1bbd361cdf0f05684c9ace6fffcb9c7bb5848a39a3af04e19e0"} Oct 07 14:12:06 crc kubenswrapper[4717]: I1007 14:12:06.190266 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerStarted","Data":"0333526c7eb99b8b7c23896da2b98c249df98690b571453237c001843015d588"} Oct 07 14:12:06 crc kubenswrapper[4717]: I1007 14:12:06.263266 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8w5dj"] Oct 07 14:12:07 crc kubenswrapper[4717]: I1007 14:12:07.215855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerStarted","Data":"0f04054b9015786809c1f294f2e557e2875402fdf4d407d2b423c550565c207a"} Oct 07 14:12:07 crc kubenswrapper[4717]: I1007 14:12:07.220624 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8w5dj" event={"ID":"cb6f9762-a7d3-48ee-97ce-57439f4ee323","Type":"ContainerStarted","Data":"2f705c4c414143c30e180bd19bb50124e635fb19a70b198027dcac8cea7a78f2"} Oct 07 14:12:07 crc kubenswrapper[4717]: I1007 14:12:07.220675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8w5dj" event={"ID":"cb6f9762-a7d3-48ee-97ce-57439f4ee323","Type":"ContainerStarted","Data":"b734a64f8f26523b5331332da0eb3637c3b42df56c815e33fe192c5673687eaa"} Oct 07 14:12:08 crc kubenswrapper[4717]: I1007 14:12:08.244270 4717 generic.go:334] "Generic (PLEG): container finished" podID="de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" containerID="b6faa65930282b1c869049b210335050b683b6225fb3aba45b13724f327ae3cb" exitCode=0 Oct 07 14:12:08 crc kubenswrapper[4717]: I1007 14:12:08.244420 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z2524" event={"ID":"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3","Type":"ContainerDied","Data":"b6faa65930282b1c869049b210335050b683b6225fb3aba45b13724f327ae3cb"} Oct 07 14:12:08 crc kubenswrapper[4717]: I1007 14:12:08.266510 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8w5dj" podStartSLOduration=3.266492895 podStartE2EDuration="3.266492895s" podCreationTimestamp="2025-10-07 14:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:07.256362323 +0000 UTC m=+1109.084288115" watchObservedRunningTime="2025-10-07 14:12:08.266492895 +0000 UTC m=+1110.094418687" Oct 07 14:12:09 crc kubenswrapper[4717]: I1007 14:12:09.343775 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:09 crc kubenswrapper[4717]: I1007 14:12:09.344074 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:09 crc kubenswrapper[4717]: I1007 14:12:09.387912 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:09 crc kubenswrapper[4717]: I1007 14:12:09.400552 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.262771 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.263119 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 
14:12:10.907972 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.988708 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.989088 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.989168 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5wqw\" (UniqueName: \"kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.989327 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.989354 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.989908 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data\") pod \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\" (UID: \"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3\") " Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.993418 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw" (OuterVolumeSpecName: "kube-api-access-l5wqw") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "kube-api-access-l5wqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:10 crc kubenswrapper[4717]: I1007 14:12:10.994896 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts" (OuterVolumeSpecName: "scripts") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:10.998813 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.001751 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.032590 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.042172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data" (OuterVolumeSpecName: "config-data") pod "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" (UID: "de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092529 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092556 4717 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092566 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092576 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092584 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.092594 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5wqw\" (UniqueName: \"kubernetes.io/projected/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3-kube-api-access-l5wqw\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.274888 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-z2524" event={"ID":"de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3","Type":"ContainerDied","Data":"b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089"} Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.274932 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b63739873cfbcaeed2a5d63ca10e4b04bc2d414287e9db4e595ad68e692ce089" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.274938 4717 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-z2524" Oct 07 14:12:11 crc kubenswrapper[4717]: I1007 14:12:11.279046 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerStarted","Data":"4344f6f9e56b78648e493214c9284162f44c2b10dbc248afe1e75154bd8b3de1"} Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.005267 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-74d44865f4-vrndk"] Oct 07 14:12:12 crc kubenswrapper[4717]: E1007 14:12:12.006040 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" containerName="keystone-bootstrap" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.006056 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" containerName="keystone-bootstrap" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.006307 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" containerName="keystone-bootstrap" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.007037 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.023419 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.023699 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.023863 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.023931 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-m4h7z" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.023880 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.024174 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.042399 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74d44865f4-vrndk"] Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109522 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj6tz\" (UniqueName: \"kubernetes.io/projected/230706b7-0a7f-4b69-973f-4e9c1253e7b9-kube-api-access-xj6tz\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109571 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-config-data\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109589 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-credential-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109617 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-combined-ca-bundle\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-scripts\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.109949 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-public-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.110037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-internal-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.110140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-fernet-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj6tz\" (UniqueName: \"kubernetes.io/projected/230706b7-0a7f-4b69-973f-4e9c1253e7b9-kube-api-access-xj6tz\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212121 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-config-data\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212142 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-credential-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-combined-ca-bundle\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212226 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-scripts\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212290 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-public-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212319 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-internal-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.212406 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-fernet-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.221419 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-scripts\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.225466 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-fernet-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.228108 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-config-data\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.229323 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-credential-keys\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.233596 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj6tz\" (UniqueName: \"kubernetes.io/projected/230706b7-0a7f-4b69-973f-4e9c1253e7b9-kube-api-access-xj6tz\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " 
pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.234217 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-internal-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.236965 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-public-tls-certs\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.246946 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/230706b7-0a7f-4b69-973f-4e9c1253e7b9-combined-ca-bundle\") pod \"keystone-74d44865f4-vrndk\" (UID: \"230706b7-0a7f-4b69-973f-4e9c1253e7b9\") " pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.296703 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerStarted","Data":"1f7b6873101b80934ebcd8b04468facfc3035d0249009cd29f3246b3808ae2ab"} Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.327912 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.327895219 podStartE2EDuration="8.327895219s" podCreationTimestamp="2025-10-07 14:12:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:12.3243049 +0000 UTC m=+1114.152230692" watchObservedRunningTime="2025-10-07 14:12:12.327895219 +0000 UTC m=+1114.155821011" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.355353 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.750656 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.750763 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 14:12:12 crc kubenswrapper[4717]: I1007 14:12:12.751369 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.476039 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.562438 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.562552 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.602113 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.603305 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-ff6468b6d-j9vqb" podUID="fabca24d-42a9-45e6-81ca-ad04bf8bd588" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 07 14:12:14 crc kubenswrapper[4717]: I1007 14:12:14.609478 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:12:15 crc kubenswrapper[4717]: I1007 14:12:15.321865 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:12:15 crc kubenswrapper[4717]: I1007 14:12:15.321913 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:12:17 crc kubenswrapper[4717]: I1007 14:12:17.589932 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:12:18 crc kubenswrapper[4717]: I1007 14:12:18.493687 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:12:26 crc kubenswrapper[4717]: E1007 14:12:26.261450 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 07 14:12:26 crc kubenswrapper[4717]: E1007 14:12:26.262040 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tlgc8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tjhjz_openstack(8dd33a1d-9592-465a-8285-941a03e92fa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:12:26 crc kubenswrapper[4717]: E1007 14:12:26.263241 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tjhjz" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" Oct 07 14:12:26 crc kubenswrapper[4717]: I1007 14:12:26.391998 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:26 crc kubenswrapper[4717]: E1007 14:12:26.420016 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-tjhjz" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" Oct 07 14:12:26 crc kubenswrapper[4717]: I1007 14:12:26.622665 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:12:28 crc kubenswrapper[4717]: I1007 14:12:28.100424 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:28 crc kubenswrapper[4717]: I1007 14:12:28.372443 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-ff6468b6d-j9vqb" Oct 07 14:12:28 crc kubenswrapper[4717]: I1007 14:12:28.460114 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:12:28 crc kubenswrapper[4717]: I1007 14:12:28.460927 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon-log" containerID="cri-o://de5625b664832e995fc93a9a6085e4a0819c974d6527e6d58b235012ed3d3157" gracePeriod=30 Oct 07 14:12:28 crc kubenswrapper[4717]: I1007 14:12:28.461195 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" containerID="cri-o://1be9c8727e96c507158a78f88891832a24bc2136e825aa65da4b1531801f787f" gracePeriod=30 Oct 07 14:12:30 crc kubenswrapper[4717]: E1007 14:12:30.598600 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 07 14:12:30 crc kubenswrapper[4717]: E1007 14:12:30.599109 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-295kt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gch6l_openstack(339f6768-36c5-4856-9159-29573ce25fe8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:12:30 crc kubenswrapper[4717]: E1007 14:12:30.600251 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gch6l" podUID="339f6768-36c5-4856-9159-29573ce25fe8" Oct 07 14:12:31 crc kubenswrapper[4717]: E1007 14:12:31.210169 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-manila-api:current-podified" Oct 07 14:12:31 crc kubenswrapper[4717]: E1007 14:12:31.211344 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manila-db-sync,Image:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,Command:[/bin/bash],Args:[-c sleep 0 && /usr/bin/manila-manage --config-dir /etc/manila/manila.conf.d db sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:job-config-data,ReadOnly:true,MountPath:/etc/manila/manila.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qb4mp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42429,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42429,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-db-sync-fdlxw_openstack(1703fd90-e328-4f67-850f-38f8663dd2c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:12:31 crc kubenswrapper[4717]: E1007 14:12:31.212662 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/manila-db-sync-fdlxw" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" Oct 07 14:12:31 crc kubenswrapper[4717]: E1007 14:12:31.480697 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manila-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-manila-api:current-podified\\\"\"" pod="openstack/manila-db-sync-fdlxw" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" Oct 07 14:12:31 crc kubenswrapper[4717]: E1007 14:12:31.484480 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gch6l" podUID="339f6768-36c5-4856-9159-29573ce25fe8" Oct 07 14:12:31 crc kubenswrapper[4717]: I1007 14:12:31.609987 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:12:31 crc kubenswrapper[4717]: I1007 14:12:31.610057 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:12:31 crc kubenswrapper[4717]: I1007 14:12:31.717923 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74d44865f4-vrndk"] Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.489908 4717 generic.go:334] "Generic (PLEG): container finished" podID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerID="5dea6e2e805cbbff195d88cd91cf7a6015c6ecfb7f01b84d946bb0bda96b0050" exitCode=137 Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.490192 4717 generic.go:334] "Generic (PLEG): container finished" podID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerID="020423bd11264a6b25390d927315cc8292362dab4ac1e3794faf30a8bed66485" exitCode=137 Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.490000 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerDied","Data":"5dea6e2e805cbbff195d88cd91cf7a6015c6ecfb7f01b84d946bb0bda96b0050"} Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.490272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerDied","Data":"020423bd11264a6b25390d927315cc8292362dab4ac1e3794faf30a8bed66485"} Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.494677 4717 generic.go:334] "Generic (PLEG): container finished" podID="737597e8-3ae8-4847-b38f-99644001bd0e" containerID="1be9c8727e96c507158a78f88891832a24bc2136e825aa65da4b1531801f787f" exitCode=0 Oct 07 14:12:32 crc kubenswrapper[4717]: I1007 14:12:32.494701 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerDied","Data":"1be9c8727e96c507158a78f88891832a24bc2136e825aa65da4b1531801f787f"} Oct 07 14:12:34 crc kubenswrapper[4717]: I1007 14:12:34.474817 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 07 14:12:35 crc kubenswrapper[4717]: W1007 14:12:35.662242 4717 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod230706b7_0a7f_4b69_973f_4e9c1253e7b9.slice/crio-b8e576c68aecfcf4f2f69acd5075543d7b7a20f24ac69903b9ba5937c8a1a3cc WatchSource:0}: Error finding container b8e576c68aecfcf4f2f69acd5075543d7b7a20f24ac69903b9ba5937c8a1a3cc: Status 404 returned error can't find the container with id b8e576c68aecfcf4f2f69acd5075543d7b7a20f24ac69903b9ba5937c8a1a3cc Oct 07 14:12:36 crc kubenswrapper[4717]: I1007 14:12:36.525668 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74d44865f4-vrndk" event={"ID":"230706b7-0a7f-4b69-973f-4e9c1253e7b9","Type":"ContainerStarted","Data":"b8e576c68aecfcf4f2f69acd5075543d7b7a20f24ac69903b9ba5937c8a1a3cc"} Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.758550 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.963717 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhzm7\" (UniqueName: \"kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7\") pod \"8056f9f9-b82b-4309-a225-241d2a7ba680\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs\") pod \"8056f9f9-b82b-4309-a225-241d2a7ba680\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964507 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data\") pod \"8056f9f9-b82b-4309-a225-241d2a7ba680\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964535 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key\") pod \"8056f9f9-b82b-4309-a225-241d2a7ba680\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964574 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts\") pod \"8056f9f9-b82b-4309-a225-241d2a7ba680\" (UID: \"8056f9f9-b82b-4309-a225-241d2a7ba680\") " Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964825 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs" (OuterVolumeSpecName: "logs") pod "8056f9f9-b82b-4309-a225-241d2a7ba680" (UID: "8056f9f9-b82b-4309-a225-241d2a7ba680"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.964957 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8056f9f9-b82b-4309-a225-241d2a7ba680-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.968874 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7" (OuterVolumeSpecName: "kube-api-access-xhzm7") pod "8056f9f9-b82b-4309-a225-241d2a7ba680" (UID: "8056f9f9-b82b-4309-a225-241d2a7ba680"). InnerVolumeSpecName "kube-api-access-xhzm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.969647 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8056f9f9-b82b-4309-a225-241d2a7ba680" (UID: "8056f9f9-b82b-4309-a225-241d2a7ba680"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.989358 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data" (OuterVolumeSpecName: "config-data") pod "8056f9f9-b82b-4309-a225-241d2a7ba680" (UID: "8056f9f9-b82b-4309-a225-241d2a7ba680"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:38 crc kubenswrapper[4717]: I1007 14:12:38.990289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts" (OuterVolumeSpecName: "scripts") pod "8056f9f9-b82b-4309-a225-241d2a7ba680" (UID: "8056f9f9-b82b-4309-a225-241d2a7ba680"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.065974 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhzm7\" (UniqueName: \"kubernetes.io/projected/8056f9f9-b82b-4309-a225-241d2a7ba680-kube-api-access-xhzm7\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.066250 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.066262 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8056f9f9-b82b-4309-a225-241d2a7ba680-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.066272 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8056f9f9-b82b-4309-a225-241d2a7ba680-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635274 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerStarted","Data":"3a9c4874c5ae9b373fb756c83805e273c55f220baefedbb7c2b2dc55f02479cc"} Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635494 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-central-agent" containerID="cri-o://6e5268975f6e2494483aafa7302966facb46750f60a301e3bc96fc9184b452d3" gracePeriod=30 Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635572 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635890 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="proxy-httpd" containerID="cri-o://3a9c4874c5ae9b373fb756c83805e273c55f220baefedbb7c2b2dc55f02479cc" gracePeriod=30 Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635937 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="sg-core" containerID="cri-o://4344f6f9e56b78648e493214c9284162f44c2b10dbc248afe1e75154bd8b3de1" gracePeriod=30 Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.635971 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-notification-agent" containerID="cri-o://db185d63ce16d27abd278e7cefeb63a1e6642eaac935f05068f16f941de4bb0d" gracePeriod=30 Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.670590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766954cfd9-kkf7x" event={"ID":"8056f9f9-b82b-4309-a225-241d2a7ba680","Type":"ContainerDied","Data":"bb058dd907c6d136b448a167a7ebba016e2380acd6679d17ef4cda5da83c9683"} Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.670643 4717 scope.go:117] "RemoveContainer" containerID="5dea6e2e805cbbff195d88cd91cf7a6015c6ecfb7f01b84d946bb0bda96b0050" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.670705 4717 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766954cfd9-kkf7x" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.676640 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qmft2" event={"ID":"eb1a5ed9-e123-447d-a56d-e0cce35eb56a","Type":"ContainerStarted","Data":"729507490ead88b8332b0b90d09b340fc07490c4ed508bd1cf56e705bfc50819"} Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.678742 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74d44865f4-vrndk" event={"ID":"230706b7-0a7f-4b69-973f-4e9c1253e7b9","Type":"ContainerStarted","Data":"462451196ff7c1f5376410bd2306e8ce74a60f373cb6944493166bbe5ad4c507"} Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.680276 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.690964 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.021921928 podStartE2EDuration="59.69094226s" podCreationTimestamp="2025-10-07 14:11:40 +0000 UTC" firstStartedPulling="2025-10-07 14:11:42.043757539 +0000 UTC m=+1083.871683331" lastFinishedPulling="2025-10-07 14:12:38.712777861 +0000 UTC m=+1140.540703663" observedRunningTime="2025-10-07 14:12:39.683878185 +0000 UTC m=+1141.511803987" watchObservedRunningTime="2025-10-07 14:12:39.69094226 +0000 UTC m=+1141.518868052" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.701919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tjhjz" event={"ID":"8dd33a1d-9592-465a-8285-941a03e92fa4","Type":"ContainerStarted","Data":"83bd414aff4dec2a36f523bbb9ab67ac041ffb1a60090ae72f3bac9be7cbec45"} Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.728231 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-qmft2" podStartSLOduration=2.1990056 podStartE2EDuration="58.728207509s" podCreationTimestamp="2025-10-07 14:11:41 +0000 UTC" firstStartedPulling="2025-10-07 14:11:42.046499975 +0000 UTC m=+1083.874425767" lastFinishedPulling="2025-10-07 14:12:38.575701894 +0000 UTC m=+1140.403627676" observedRunningTime="2025-10-07 14:12:39.717659578 +0000 UTC m=+1141.545585370" watchObservedRunningTime="2025-10-07 14:12:39.728207509 +0000 UTC m=+1141.556133301" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.746610 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-74d44865f4-vrndk" podStartSLOduration=28.746589147 podStartE2EDuration="28.746589147s" podCreationTimestamp="2025-10-07 14:12:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:39.735799249 +0000 UTC m=+1141.563725041" watchObservedRunningTime="2025-10-07 14:12:39.746589147 +0000 UTC m=+1141.574514939" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.768156 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.792328 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tjhjz" podStartSLOduration=7.954723628 podStartE2EDuration="45.79230926s" podCreationTimestamp="2025-10-07 14:11:54 +0000 UTC" firstStartedPulling="2025-10-07 14:12:00.852317917 +0000 UTC m=+1102.680243709" 
lastFinishedPulling="2025-10-07 14:12:38.689903549 +0000 UTC m=+1140.517829341" observedRunningTime="2025-10-07 14:12:39.774841367 +0000 UTC m=+1141.602767159" watchObservedRunningTime="2025-10-07 14:12:39.79230926 +0000 UTC m=+1141.620235052" Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.792545 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-766954cfd9-kkf7x"] Oct 07 14:12:39 crc kubenswrapper[4717]: I1007 14:12:39.941231 4717 scope.go:117] "RemoveContainer" containerID="020423bd11264a6b25390d927315cc8292362dab4ac1e3794faf30a8bed66485" Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.712504 4717 generic.go:334] "Generic (PLEG): container finished" podID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerID="3a9c4874c5ae9b373fb756c83805e273c55f220baefedbb7c2b2dc55f02479cc" exitCode=0 Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.713142 4717 generic.go:334] "Generic (PLEG): container finished" podID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerID="4344f6f9e56b78648e493214c9284162f44c2b10dbc248afe1e75154bd8b3de1" exitCode=2 Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.713267 4717 generic.go:334] "Generic (PLEG): container finished" podID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerID="6e5268975f6e2494483aafa7302966facb46750f60a301e3bc96fc9184b452d3" exitCode=0 Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.712530 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerDied","Data":"3a9c4874c5ae9b373fb756c83805e273c55f220baefedbb7c2b2dc55f02479cc"} Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.713564 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerDied","Data":"4344f6f9e56b78648e493214c9284162f44c2b10dbc248afe1e75154bd8b3de1"} Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.714122 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerDied","Data":"6e5268975f6e2494483aafa7302966facb46750f60a301e3bc96fc9184b452d3"} Oct 07 14:12:40 crc kubenswrapper[4717]: I1007 14:12:40.877324 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" path="/var/lib/kubelet/pods/8056f9f9-b82b-4309-a225-241d2a7ba680/volumes" Oct 07 14:12:42 crc kubenswrapper[4717]: I1007 14:12:42.742885 4717 generic.go:334] "Generic (PLEG): container finished" podID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" containerID="729507490ead88b8332b0b90d09b340fc07490c4ed508bd1cf56e705bfc50819" exitCode=0 Oct 07 14:12:42 crc kubenswrapper[4717]: I1007 14:12:42.743052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qmft2" event={"ID":"eb1a5ed9-e123-447d-a56d-e0cce35eb56a","Type":"ContainerDied","Data":"729507490ead88b8332b0b90d09b340fc07490c4ed508bd1cf56e705bfc50819"} Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.078659 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qmft2" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171155 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts\") pod \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs\") pod \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171319 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfjjl\" (UniqueName: \"kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl\") pod \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle\") pod \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171472 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data\") pod \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\" (UID: \"eb1a5ed9-e123-447d-a56d-e0cce35eb56a\") " Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.171763 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs" (OuterVolumeSpecName: "logs") pod "eb1a5ed9-e123-447d-a56d-e0cce35eb56a" (UID: "eb1a5ed9-e123-447d-a56d-e0cce35eb56a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.172178 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.176126 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts" (OuterVolumeSpecName: "scripts") pod "eb1a5ed9-e123-447d-a56d-e0cce35eb56a" (UID: "eb1a5ed9-e123-447d-a56d-e0cce35eb56a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.176280 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl" (OuterVolumeSpecName: "kube-api-access-qfjjl") pod "eb1a5ed9-e123-447d-a56d-e0cce35eb56a" (UID: "eb1a5ed9-e123-447d-a56d-e0cce35eb56a"). InnerVolumeSpecName "kube-api-access-qfjjl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.196334 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data" (OuterVolumeSpecName: "config-data") pod "eb1a5ed9-e123-447d-a56d-e0cce35eb56a" (UID: "eb1a5ed9-e123-447d-a56d-e0cce35eb56a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.204045 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb1a5ed9-e123-447d-a56d-e0cce35eb56a" (UID: "eb1a5ed9-e123-447d-a56d-e0cce35eb56a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.273594 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfjjl\" (UniqueName: \"kubernetes.io/projected/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-kube-api-access-qfjjl\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.273633 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.273645 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.273654 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb1a5ed9-e123-447d-a56d-e0cce35eb56a-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.474529 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.761291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-qmft2" event={"ID":"eb1a5ed9-e123-447d-a56d-e0cce35eb56a","Type":"ContainerDied","Data":"abd0d61d8620a77edcb9f1be000cf5e05efca5337bb190b804197adc26169166"} Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.761589 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd0d61d8620a77edcb9f1be000cf5e05efca5337bb190b804197adc26169166" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.761613 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-qmft2" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.762917 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fdlxw" event={"ID":"1703fd90-e328-4f67-850f-38f8663dd2c2","Type":"ContainerStarted","Data":"b93e480bbb5018f3ccdd3bd336d0576e5c6b868c33875b4ba82588ae48cff50f"} Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.783277 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-fdlxw" podStartSLOduration=3.275245183 podStartE2EDuration="40.783258001s" podCreationTimestamp="2025-10-07 14:12:04 +0000 UTC" firstStartedPulling="2025-10-07 14:12:05.796270021 +0000 UTC m=+1107.624195813" lastFinishedPulling="2025-10-07 14:12:43.304282839 +0000 UTC m=+1145.132208631" observedRunningTime="2025-10-07 14:12:44.778955392 +0000 UTC m=+1146.606881204" watchObservedRunningTime="2025-10-07 14:12:44.783258001 +0000 UTC m=+1146.611183793" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.904612 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5848b6684d-cddjh"] Oct 07 14:12:44 crc kubenswrapper[4717]: E1007 14:12:44.905082 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" containerName="placement-db-sync" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905106 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" containerName="placement-db-sync" Oct 07 14:12:44 crc kubenswrapper[4717]: E1007 14:12:44.905145 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon-log" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905153 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon-log" Oct 07 14:12:44 crc kubenswrapper[4717]: E1007 14:12:44.905169 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905176 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905379 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon-log" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905403 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" containerName="placement-db-sync" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.905428 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8056f9f9-b82b-4309-a225-241d2a7ba680" containerName="horizon" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.906599 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.909106 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.909440 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.909573 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nx2x7" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.909692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.910187 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.926851 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5848b6684d-cddjh"] Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.993732 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-scripts\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.993786 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-logs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.993823 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-public-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.993875 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-internal-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.993959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-combined-ca-bundle\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.994085 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-config-data\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:44 crc kubenswrapper[4717]: I1007 14:12:44.994114 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkcjr\" (UniqueName: \"kubernetes.io/projected/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-kube-api-access-bkcjr\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.095803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-scripts\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.095873 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-logs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.095930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-public-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.095968 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-internal-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.096027 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-combined-ca-bundle\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.096089 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-config-data\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.096118 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkcjr\" (UniqueName: \"kubernetes.io/projected/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-kube-api-access-bkcjr\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.096981 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-logs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.105971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-public-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.106058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-combined-ca-bundle\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.106084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-internal-tls-certs\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.107072 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-scripts\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.114966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-config-data\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.117352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkcjr\" (UniqueName: \"kubernetes.io/projected/bd72a516-7ee9-4ff2-a7b0-f928a10e676d-kube-api-access-bkcjr\") pod \"placement-5848b6684d-cddjh\" (UID: \"bd72a516-7ee9-4ff2-a7b0-f928a10e676d\") " pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:45 crc kubenswrapper[4717]: I1007 14:12:45.225801 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:51 crc kubenswrapper[4717]: I1007 14:12:51.723893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5848b6684d-cddjh"] Oct 07 14:12:51 crc kubenswrapper[4717]: I1007 14:12:51.819430 4717 generic.go:334] "Generic (PLEG): container finished" podID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerID="db185d63ce16d27abd278e7cefeb63a1e6642eaac935f05068f16f941de4bb0d" exitCode=0 Oct 07 14:12:51 crc kubenswrapper[4717]: I1007 14:12:51.819494 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerDied","Data":"db185d63ce16d27abd278e7cefeb63a1e6642eaac935f05068f16f941de4bb0d"} Oct 07 14:12:51 crc kubenswrapper[4717]: I1007 14:12:51.822133 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5848b6684d-cddjh" event={"ID":"bd72a516-7ee9-4ff2-a7b0-f928a10e676d","Type":"ContainerStarted","Data":"14d1146ba284984aef4ea6f04dad0b7a7876500ee8d3b0002a068e5572572179"} Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.059550 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114540 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114660 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114856 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkjss\" (UniqueName: \"kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.114994 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.115044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.115250 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.115466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.115836 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.115927 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6d8f021f-8b1b-4a30-80a4-b01a299c734f-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.120136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts" (OuterVolumeSpecName: "scripts") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.120181 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss" (OuterVolumeSpecName: "kube-api-access-rkjss") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "kube-api-access-rkjss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.146103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.192907 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217134 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data" (OuterVolumeSpecName: "config-data") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") pod \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\" (UID: \"6d8f021f-8b1b-4a30-80a4-b01a299c734f\") " Oct 07 14:12:52 crc kubenswrapper[4717]: W1007 14:12:52.217365 4717 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6d8f021f-8b1b-4a30-80a4-b01a299c734f/volumes/kubernetes.io~secret/config-data Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217378 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data" (OuterVolumeSpecName: "config-data") pod "6d8f021f-8b1b-4a30-80a4-b01a299c734f" (UID: "6d8f021f-8b1b-4a30-80a4-b01a299c734f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217794 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217810 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217820 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217829 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d8f021f-8b1b-4a30-80a4-b01a299c734f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.217838 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkjss\" (UniqueName: \"kubernetes.io/projected/6d8f021f-8b1b-4a30-80a4-b01a299c734f-kube-api-access-rkjss\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.831706 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gch6l" event={"ID":"339f6768-36c5-4856-9159-29573ce25fe8","Type":"ContainerStarted","Data":"4e9d004ce614d6e73bd50f6a9aabd4d649b982cdde84bcad0d197112347c667a"} Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.833636 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5848b6684d-cddjh" event={"ID":"bd72a516-7ee9-4ff2-a7b0-f928a10e676d","Type":"ContainerStarted","Data":"774660271d63fc5bb20cd5e20780f928c702d91d97da2d52a0e1161a58127b8d"} Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.833664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5848b6684d-cddjh" event={"ID":"bd72a516-7ee9-4ff2-a7b0-f928a10e676d","Type":"ContainerStarted","Data":"f5d722a2518293860367d854a1c4c2da9581a03a18b916e4aeb144a97d7de904"} Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.833846 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:52 crc kubenswrapper[4717]: 
I1007 14:12:52.833988 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.836562 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6d8f021f-8b1b-4a30-80a4-b01a299c734f","Type":"ContainerDied","Data":"8ab925f1099f4dd7df528bdf628e09192f89ce11149ca6ddeba9aa3f3dc9916c"} Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.836608 4717 scope.go:117] "RemoveContainer" containerID="3a9c4874c5ae9b373fb756c83805e273c55f220baefedbb7c2b2dc55f02479cc" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.836639 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.862370 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gch6l" podStartSLOduration=2.887347439 podStartE2EDuration="48.862348614s" podCreationTimestamp="2025-10-07 14:12:04 +0000 UTC" firstStartedPulling="2025-10-07 14:12:05.846711264 +0000 UTC m=+1107.674637056" lastFinishedPulling="2025-10-07 14:12:51.821712439 +0000 UTC m=+1153.649638231" observedRunningTime="2025-10-07 14:12:52.850917739 +0000 UTC m=+1154.678843531" watchObservedRunningTime="2025-10-07 14:12:52.862348614 +0000 UTC m=+1154.690274406" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.865627 4717 scope.go:117] "RemoveContainer" containerID="4344f6f9e56b78648e493214c9284162f44c2b10dbc248afe1e75154bd8b3de1" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.889170 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5848b6684d-cddjh" podStartSLOduration=8.889147245 podStartE2EDuration="8.889147245s" podCreationTimestamp="2025-10-07 14:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:12:52.88065967 +0000 UTC m=+1154.708585462" watchObservedRunningTime="2025-10-07 14:12:52.889147245 +0000 UTC m=+1154.717073037" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.928165 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.928437 4717 scope.go:117] "RemoveContainer" containerID="db185d63ce16d27abd278e7cefeb63a1e6642eaac935f05068f16f941de4bb0d" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.956609 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.958156 4717 scope.go:117] "RemoveContainer" containerID="6e5268975f6e2494483aafa7302966facb46750f60a301e3bc96fc9184b452d3" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965151 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:12:52 crc kubenswrapper[4717]: E1007 14:12:52.965567 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="proxy-httpd" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965584 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="proxy-httpd" Oct 07 14:12:52 crc kubenswrapper[4717]: E1007 14:12:52.965593 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-central-agent" 
Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965600 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-central-agent" Oct 07 14:12:52 crc kubenswrapper[4717]: E1007 14:12:52.965620 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-notification-agent" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965628 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-notification-agent" Oct 07 14:12:52 crc kubenswrapper[4717]: E1007 14:12:52.965650 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="sg-core" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965655 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="sg-core" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965821 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-central-agent" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965840 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="proxy-httpd" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965851 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="sg-core" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.965865 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" containerName="ceilometer-notification-agent" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.967627 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.969786 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.972103 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:12:52 crc kubenswrapper[4717]: I1007 14:12:52.974232 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqwq\" (UniqueName: \"kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040462 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040497 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040527 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040566 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040673 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.040729 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.142751 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143153 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqwq\" (UniqueName: \"kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143513 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143604 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143692 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.143823 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.144209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.147325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.147547 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.149536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.171928 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.177162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqwq\" (UniqueName: \"kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq\") pod \"ceilometer-0\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.286434 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.749667 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:12:53 crc kubenswrapper[4717]: I1007 14:12:53.846493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerStarted","Data":"c82b637562d5ff50cedab51c4c4b736b0654ea6b5b5abccf6794f8a6ff3a4731"} Oct 07 14:12:54 crc kubenswrapper[4717]: I1007 14:12:54.474420 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5b6cc95fd6-f8jf5" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Oct 07 14:12:54 crc kubenswrapper[4717]: I1007 14:12:54.474528 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:54 crc kubenswrapper[4717]: I1007 14:12:54.880578 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8f021f-8b1b-4a30-80a4-b01a299c734f" path="/var/lib/kubelet/pods/6d8f021f-8b1b-4a30-80a4-b01a299c734f/volumes" Oct 07 14:12:56 crc kubenswrapper[4717]: I1007 14:12:56.886070 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerStarted","Data":"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19"} Oct 07 14:12:58 crc kubenswrapper[4717]: I1007 14:12:58.904829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerStarted","Data":"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58"} Oct 07 14:12:58 crc kubenswrapper[4717]: I1007 14:12:58.908139 4717 generic.go:334] "Generic (PLEG): container finished" podID="737597e8-3ae8-4847-b38f-99644001bd0e" containerID="de5625b664832e995fc93a9a6085e4a0819c974d6527e6d58b235012ed3d3157" exitCode=137 Oct 07 14:12:58 crc kubenswrapper[4717]: I1007 14:12:58.908172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerDied","Data":"de5625b664832e995fc93a9a6085e4a0819c974d6527e6d58b235012ed3d3157"} Oct 07 14:12:58 crc 
kubenswrapper[4717]: I1007 14:12:58.908190 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b6cc95fd6-f8jf5" event={"ID":"737597e8-3ae8-4847-b38f-99644001bd0e","Type":"ContainerDied","Data":"1cb06ec2cd147927144c9470c2c41f876da6c35157a727a648fccc48bd8fa4c1"} Oct 07 14:12:58 crc kubenswrapper[4717]: I1007 14:12:58.908200 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb06ec2cd147927144c9470c2c41f876da6c35157a727a648fccc48bd8fa4c1" Oct 07 14:12:58 crc kubenswrapper[4717]: I1007 14:12:58.934457 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.038566 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.038651 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.038713 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.038754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.039321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.039399 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srjrx\" (UniqueName: \"kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.039493 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key\") pod \"737597e8-3ae8-4847-b38f-99644001bd0e\" (UID: \"737597e8-3ae8-4847-b38f-99644001bd0e\") " Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.041480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs" (OuterVolumeSpecName: "logs") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.046546 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.046720 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx" (OuterVolumeSpecName: "kube-api-access-srjrx") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "kube-api-access-srjrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.066599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data" (OuterVolumeSpecName: "config-data") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.069092 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.070475 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts" (OuterVolumeSpecName: "scripts") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.086972 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "737597e8-3ae8-4847-b38f-99644001bd0e" (UID: "737597e8-3ae8-4847-b38f-99644001bd0e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141497 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141535 4717 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141544 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737597e8-3ae8-4847-b38f-99644001bd0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141553 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141562 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/737597e8-3ae8-4847-b38f-99644001bd0e-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141573 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/737597e8-3ae8-4847-b38f-99644001bd0e-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.141581 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srjrx\" (UniqueName: \"kubernetes.io/projected/737597e8-3ae8-4847-b38f-99644001bd0e-kube-api-access-srjrx\") on node \"crc\" DevicePath \"\"" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.923731 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b6cc95fd6-f8jf5" Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.986150 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:12:59 crc kubenswrapper[4717]: I1007 14:12:59.996421 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5b6cc95fd6-f8jf5"] Oct 07 14:13:00 crc kubenswrapper[4717]: I1007 14:13:00.880119 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" path="/var/lib/kubelet/pods/737597e8-3ae8-4847-b38f-99644001bd0e/volumes" Oct 07 14:13:00 crc kubenswrapper[4717]: I1007 14:13:00.933440 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerStarted","Data":"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567"} Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.609531 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.609612 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.609662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.610242 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.610307 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba" gracePeriod=600 Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.944192 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba" exitCode=0 Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.944291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba"} Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.944638 4717 scope.go:117] "RemoveContainer" containerID="90e7e61d077b49456611059f1c1a8bfe24645f6ad56a34f7d9dbdb19bbcf2fdc" Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.948690 4717 
generic.go:334] "Generic (PLEG): container finished" podID="8dd33a1d-9592-465a-8285-941a03e92fa4" containerID="83bd414aff4dec2a36f523bbb9ab67ac041ffb1a60090ae72f3bac9be7cbec45" exitCode=0 Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.948807 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tjhjz" event={"ID":"8dd33a1d-9592-465a-8285-941a03e92fa4","Type":"ContainerDied","Data":"83bd414aff4dec2a36f523bbb9ab67ac041ffb1a60090ae72f3bac9be7cbec45"} Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.955939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerStarted","Data":"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7"} Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.956201 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:13:01 crc kubenswrapper[4717]: I1007 14:13:01.992859 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.109772071 podStartE2EDuration="9.99284001s" podCreationTimestamp="2025-10-07 14:12:52 +0000 UTC" firstStartedPulling="2025-10-07 14:12:53.754638572 +0000 UTC m=+1155.582564364" lastFinishedPulling="2025-10-07 14:13:01.637706511 +0000 UTC m=+1163.465632303" observedRunningTime="2025-10-07 14:13:01.986632689 +0000 UTC m=+1163.814558481" watchObservedRunningTime="2025-10-07 14:13:01.99284001 +0000 UTC m=+1163.820765802" Oct 07 14:13:02 crc kubenswrapper[4717]: I1007 14:13:02.971692 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0"} Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.294429 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.346379 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlgc8\" (UniqueName: \"kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.346488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.346517 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.347331 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.347380 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.347451 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data\") pod \"8dd33a1d-9592-465a-8285-941a03e92fa4\" (UID: \"8dd33a1d-9592-465a-8285-941a03e92fa4\") " Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.347557 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.348401 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8dd33a1d-9592-465a-8285-941a03e92fa4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.353136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.366619 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts" (OuterVolumeSpecName: "scripts") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.366997 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8" (OuterVolumeSpecName: "kube-api-access-tlgc8") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "kube-api-access-tlgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.380128 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.409871 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data" (OuterVolumeSpecName: "config-data") pod "8dd33a1d-9592-465a-8285-941a03e92fa4" (UID: "8dd33a1d-9592-465a-8285-941a03e92fa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.451121 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.451161 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlgc8\" (UniqueName: \"kubernetes.io/projected/8dd33a1d-9592-465a-8285-941a03e92fa4-kube-api-access-tlgc8\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.451173 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.451184 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.451197 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd33a1d-9592-465a-8285-941a03e92fa4-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.978260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tjhjz" event={"ID":"8dd33a1d-9592-465a-8285-941a03e92fa4","Type":"ContainerDied","Data":"227379d46b79e1fba8ee453c98e256bd808c0f197555bcbd1cd5af0f216ecc57"} Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.978312 4717 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="227379d46b79e1fba8ee453c98e256bd808c0f197555bcbd1cd5af0f216ecc57" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.978386 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tjhjz" Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.985623 4717 generic.go:334] "Generic (PLEG): container finished" podID="339f6768-36c5-4856-9159-29573ce25fe8" containerID="4e9d004ce614d6e73bd50f6a9aabd4d649b982cdde84bcad0d197112347c667a" exitCode=0 Oct 07 14:13:03 crc kubenswrapper[4717]: I1007 14:13:03.985715 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gch6l" event={"ID":"339f6768-36c5-4856-9159-29573ce25fe8","Type":"ContainerDied","Data":"4e9d004ce614d6e73bd50f6a9aabd4d649b982cdde84bcad0d197112347c667a"} Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.308643 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: E1007 14:13:04.309534 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon-log" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309551 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon-log" Oct 07 14:13:04 crc kubenswrapper[4717]: E1007 14:13:04.309575 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309597 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" Oct 07 14:13:04 crc kubenswrapper[4717]: E1007 14:13:04.309612 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" containerName="cinder-db-sync" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309618 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" containerName="cinder-db-sync" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309835 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon-log" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309851 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="737597e8-3ae8-4847-b38f-99644001bd0e" containerName="horizon" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.309860 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" containerName="cinder-db-sync" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.313030 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.315335 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.315653 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-zkfpb" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.315833 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.315982 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.341056 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.342895 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.355098 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrg9b\" (UniqueName: \"kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367728 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnbk\" (UniqueName: \"kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367751 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367776 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: 
\"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367817 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367925 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367941 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.367995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.377524 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.426288 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.428117 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.438204 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.454748 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.457367 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.470141 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.472884 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.472959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.472998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7hq4\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473037 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473060 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473140 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrg9b\" (UniqueName: \"kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473187 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnbk\" (UniqueName: \"kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473255 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473372 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473395 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " 
pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473415 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473461 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473494 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473541 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473574 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5rs\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473601 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473625 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: 
I1007 14:13:04.473644 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473673 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473698 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473743 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473767 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473814 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473862 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473960 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.473982 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.474018 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.474041 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.474064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.479268 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.482225 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0\") pod 
\"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.482228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.484559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.486462 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.487952 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.489306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.494929 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.495288 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.503252 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.503327 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.509380 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrg9b\" (UniqueName: \"kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b\") pod \"dnsmasq-dns-86d8f7d9df-vgpqz\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " 
pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.513254 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.514677 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnbk\" (UniqueName: \"kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk\") pod \"cinder-scheduler-0\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578698 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578717 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578752 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd5rs\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578791 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578828 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc 
kubenswrapper[4717]: I1007 14:13:04.578848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578869 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.578914 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579025 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579051 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579097 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579137 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") 
" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579191 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579232 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579247 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579284 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579320 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7hq4\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579406 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579483 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579553 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579583 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579612 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579633 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579680 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579700 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579739 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.579914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580171 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580468 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580492 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580569 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.580666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581352 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581421 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581426 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581581 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581626 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.581694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.586850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.587501 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.590355 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.590534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.590671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.590923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.591220 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.591614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.601418 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.607704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7hq4\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4\") pod \"cinder-volume-volume1-0\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.610672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.616532 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd5rs\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs\") pod \"cinder-backup-0\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.620053 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.621617 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.630483 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.633736 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.640752 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.665468 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683136 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683204 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683253 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683385 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmt9\" (UniqueName: \"kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683447 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.683468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.752966 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785155 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785208 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785325 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmt9\" (UniqueName: \"kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785353 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.785370 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.786444 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.788389 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.789369 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 
14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.791309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.795781 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.795842 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.803914 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmt9\" (UniqueName: \"kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9\") pod \"cinder-api-0\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " pod="openstack/cinder-api-0" Oct 07 14:13:04 crc kubenswrapper[4717]: I1007 14:13:04.888178 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.059633 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.140703 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.190446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.450712 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gch6l" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.510746 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt\") pod \"339f6768-36c5-4856-9159-29573ce25fe8\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.510896 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data\") pod \"339f6768-36c5-4856-9159-29573ce25fe8\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.511094 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle\") pod \"339f6768-36c5-4856-9159-29573ce25fe8\" (UID: \"339f6768-36c5-4856-9159-29573ce25fe8\") " Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.532248 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt" (OuterVolumeSpecName: "kube-api-access-295kt") pod "339f6768-36c5-4856-9159-29573ce25fe8" (UID: "339f6768-36c5-4856-9159-29573ce25fe8"). InnerVolumeSpecName "kube-api-access-295kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.536512 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "339f6768-36c5-4856-9159-29573ce25fe8" (UID: "339f6768-36c5-4856-9159-29573ce25fe8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.556276 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "339f6768-36c5-4856-9159-29573ce25fe8" (UID: "339f6768-36c5-4856-9159-29573ce25fe8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.616388 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.616432 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295kt\" (UniqueName: \"kubernetes.io/projected/339f6768-36c5-4856-9159-29573ce25fe8-kube-api-access-295kt\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.616448 4717 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/339f6768-36c5-4856-9159-29573ce25fe8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.684650 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.752138 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:05 crc kubenswrapper[4717]: I1007 14:13:05.919451 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.021183 4717 generic.go:334] "Generic (PLEG): container finished" podID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerID="b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3" exitCode=0 Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.021291 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" event={"ID":"04f3bb08-dd18-42b6-ab66-c82b8f9966ff","Type":"ContainerDied","Data":"b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.021324 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" event={"ID":"04f3bb08-dd18-42b6-ab66-c82b8f9966ff","Type":"ContainerStarted","Data":"0ab788b0de9e72b2d0046c6fe769232477c34b9a39ceda650888ff233bc32542"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.026911 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerStarted","Data":"da4ca233fe96ad22f77aa2a601fb04b7a38440f70e4d3bcaf5ad4ad4ab8ad081"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.031064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerStarted","Data":"3e9b0d73847caf4dacf73739b540618379af9787c79c82aeafa8dcc81befef58"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.056077 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gch6l" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.056440 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gch6l" event={"ID":"339f6768-36c5-4856-9159-29573ce25fe8","Type":"ContainerDied","Data":"b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.056489 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b887a6a89c7c2e78f8f10ffe1186815f4ba459f665d8cd95dbc59671d97d90b8" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.068563 4717 generic.go:334] "Generic (PLEG): container finished" podID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" containerID="2f705c4c414143c30e180bd19bb50124e635fb19a70b198027dcac8cea7a78f2" exitCode=0 Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.068633 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8w5dj" event={"ID":"cb6f9762-a7d3-48ee-97ce-57439f4ee323","Type":"ContainerDied","Data":"2f705c4c414143c30e180bd19bb50124e635fb19a70b198027dcac8cea7a78f2"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.079000 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerStarted","Data":"e2ab7ccf24204838aa8b0a67b76effa6005eb1af8be40093c68ab39347f3ee9d"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.086882 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerStarted","Data":"78ec01c531b87c9fcbdd4d8ba7388c1a4e53ed9aa188618de1e0c8b98b7c6dc2"} Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.217742 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-b648ffc67-ks8w7"] Oct 07 14:13:06 crc kubenswrapper[4717]: E1007 14:13:06.218577 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="339f6768-36c5-4856-9159-29573ce25fe8" containerName="barbican-db-sync" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.218597 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="339f6768-36c5-4856-9159-29573ce25fe8" containerName="barbican-db-sync" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.218780 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="339f6768-36c5-4856-9159-29573ce25fe8" containerName="barbican-db-sync" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.219769 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.227715 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-px6bj" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.227876 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.228168 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.229265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data-custom\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.229565 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.229723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f75cw\" (UniqueName: \"kubernetes.io/projected/c96f41ab-26d3-44e4-8ad3-732b104a09df-kube-api-access-f75cw\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.229763 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96f41ab-26d3-44e4-8ad3-732b104a09df-logs\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.229780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-combined-ca-bundle\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.243223 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b648ffc67-ks8w7"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.282736 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-8474564558-fss6t"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.284963 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.288928 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.323507 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8474564558-fss6t"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332816 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fgxb\" (UniqueName: \"kubernetes.io/projected/237c55e8-afd6-4798-b0cf-7f8b20b0e323-kube-api-access-5fgxb\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332871 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96f41ab-26d3-44e4-8ad3-732b104a09df-logs\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332897 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-combined-ca-bundle\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332939 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332960 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-combined-ca-bundle\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.332994 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data-custom\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.333081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.333124 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/237c55e8-afd6-4798-b0cf-7f8b20b0e323-logs\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.333146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data-custom\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.333198 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f75cw\" (UniqueName: \"kubernetes.io/projected/c96f41ab-26d3-44e4-8ad3-732b104a09df-kube-api-access-f75cw\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.333847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c96f41ab-26d3-44e4-8ad3-732b104a09df-logs\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.339867 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data-custom\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.340377 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-config-data\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.351271 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c96f41ab-26d3-44e4-8ad3-732b104a09df-combined-ca-bundle\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.370503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f75cw\" (UniqueName: \"kubernetes.io/projected/c96f41ab-26d3-44e4-8ad3-732b104a09df-kube-api-access-f75cw\") pod \"barbican-worker-b648ffc67-ks8w7\" (UID: \"c96f41ab-26d3-44e4-8ad3-732b104a09df\") " pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.395443 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.426700 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.428237 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440602 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440654 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/237c55e8-afd6-4798-b0cf-7f8b20b0e323-logs\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440743 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data-custom\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440816 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fgxb\" (UniqueName: \"kubernetes.io/projected/237c55e8-afd6-4798-b0cf-7f8b20b0e323-kube-api-access-5fgxb\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440845 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkfcc\" (UniqueName: 
\"kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440870 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.440888 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-combined-ca-bundle\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.442244 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/237c55e8-afd6-4798-b0cf-7f8b20b0e323-logs\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.442922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.448984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data-custom\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.451118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-config-data\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.456290 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/237c55e8-afd6-4798-b0cf-7f8b20b0e323-combined-ca-bundle\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.471958 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fgxb\" (UniqueName: \"kubernetes.io/projected/237c55e8-afd6-4798-b0cf-7f8b20b0e323-kube-api-access-5fgxb\") pod \"barbican-keystone-listener-8474564558-fss6t\" (UID: \"237c55e8-afd6-4798-b0cf-7f8b20b0e323\") " pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.504602 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.506112 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.510595 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.522094 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544098 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544170 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544264 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 
crc kubenswrapper[4717]: I1007 14:13:06.544345 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkfcc\" (UniqueName: \"kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544366 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.544383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqzk4\" (UniqueName: \"kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.545093 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.545690 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.547050 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.549509 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.552618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.567711 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b648ffc67-ks8w7" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.572657 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkfcc\" (UniqueName: \"kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc\") pod \"dnsmasq-dns-69c986f6d7-skt9t\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.645866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.645913 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqzk4\" (UniqueName: \"kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.645986 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.646031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.646112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.653591 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.654801 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.655922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 
14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.660160 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-8474564558-fss6t" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.661637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.673115 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqzk4\" (UniqueName: \"kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4\") pod \"barbican-api-575df7c864-lbv94\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.778184 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:06 crc kubenswrapper[4717]: I1007 14:13:06.799530 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.119379 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" event={"ID":"04f3bb08-dd18-42b6-ab66-c82b8f9966ff","Type":"ContainerStarted","Data":"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349"} Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.119904 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="dnsmasq-dns" containerID="cri-o://ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349" gracePeriod=10 Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.120097 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.194386 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" podStartSLOduration=3.194364158 podStartE2EDuration="3.194364158s" podCreationTimestamp="2025-10-07 14:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:07.163163933 +0000 UTC m=+1168.991089725" watchObservedRunningTime="2025-10-07 14:13:07.194364158 +0000 UTC m=+1169.022289950" Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.309153 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b648ffc67-ks8w7"] Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.526446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-8474564558-fss6t"] Oct 07 14:13:07 crc kubenswrapper[4717]: W1007 14:13:07.584831 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod237c55e8_afd6_4798_b0cf_7f8b20b0e323.slice/crio-65efcd52e3a65b9bd9cb8192eb7b50d9b8ee22c5b59744da3342c8c5c7817ffd WatchSource:0}: Error finding container 65efcd52e3a65b9bd9cb8192eb7b50d9b8ee22c5b59744da3342c8c5c7817ffd: Status 404 returned 
error can't find the container with id 65efcd52e3a65b9bd9cb8192eb7b50d9b8ee22c5b59744da3342c8c5c7817ffd Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.728488 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.843758 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.881872 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.914089 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:13:07 crc kubenswrapper[4717]: I1007 14:13:07.923653 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config\") pod \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " Oct 07 14:13:07 crc kubenswrapper[4717]: W1007 14:13:07.970879 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod506e8cb1_00bf_414a_b0c5_1eb6935b0ed8.slice/crio-7589b8c423f48553d09ac8eabbdfa2c199e2b9106448b64d81ce03c45429f121 WatchSource:0}: Error finding container 7589b8c423f48553d09ac8eabbdfa2c199e2b9106448b64d81ce03c45429f121: Status 404 returned error can't find the container with id 7589b8c423f48553d09ac8eabbdfa2c199e2b9106448b64d81ce03c45429f121 Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.025418 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle\") pod \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.025558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clb48\" (UniqueName: \"kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48\") pod \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\" (UID: \"cb6f9762-a7d3-48ee-97ce-57439f4ee323\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.051649 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.141093 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48" (OuterVolumeSpecName: "kube-api-access-clb48") pod "cb6f9762-a7d3-48ee-97ce-57439f4ee323" (UID: "cb6f9762-a7d3-48ee-97ce-57439f4ee323"). InnerVolumeSpecName "kube-api-access-clb48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.178361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerStarted","Data":"f935be5ff57334500488699d29ae60742d2c62839fcb9dd024d0433adf3a4179"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.194051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474564558-fss6t" event={"ID":"237c55e8-afd6-4798-b0cf-7f8b20b0e323","Type":"ContainerStarted","Data":"65efcd52e3a65b9bd9cb8192eb7b50d9b8ee22c5b59744da3342c8c5c7817ffd"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.209976 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8w5dj" event={"ID":"cb6f9762-a7d3-48ee-97ce-57439f4ee323","Type":"ContainerDied","Data":"b734a64f8f26523b5331332da0eb3637c3b42df56c815e33fe192c5673687eaa"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.210033 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b734a64f8f26523b5331332da0eb3637c3b42df56c815e33fe192c5673687eaa" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.210087 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.225702 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerStarted","Data":"7589b8c423f48553d09ac8eabbdfa2c199e2b9106448b64d81ce03c45429f121"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230182 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230207 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230308 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230371 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrg9b\" (UniqueName: \"kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230448 
4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config\") pod \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\" (UID: \"04f3bb08-dd18-42b6-ab66-c82b8f9966ff\") " Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.230861 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clb48\" (UniqueName: \"kubernetes.io/projected/cb6f9762-a7d3-48ee-97ce-57439f4ee323-kube-api-access-clb48\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.246249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" event={"ID":"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4","Type":"ContainerStarted","Data":"baeb2c40062f3d280c185b9a4b197d199a52223cd5320bf261bcef4dc6a65978"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.272185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb6f9762-a7d3-48ee-97ce-57439f4ee323" (UID: "cb6f9762-a7d3-48ee-97ce-57439f4ee323"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.294199 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b648ffc67-ks8w7" event={"ID":"c96f41ab-26d3-44e4-8ad3-732b104a09df","Type":"ContainerStarted","Data":"4ddafd3f7e6ab06cfeade381409a1ca0f646a2aa1c66a9190101f949dcf54ad0"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.322555 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerStarted","Data":"72a841aebd7b7eae1bd3668f4555f016e1b8cde51e81a7949aabb9a58a8ef12e"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.332269 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.362853 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b" (OuterVolumeSpecName: "kube-api-access-qrg9b") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "kube-api-access-qrg9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.381212 4717 generic.go:334] "Generic (PLEG): container finished" podID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerID="ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349" exitCode=0 Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.381258 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" event={"ID":"04f3bb08-dd18-42b6-ab66-c82b8f9966ff","Type":"ContainerDied","Data":"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.381284 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" event={"ID":"04f3bb08-dd18-42b6-ab66-c82b8f9966ff","Type":"ContainerDied","Data":"0ab788b0de9e72b2d0046c6fe769232477c34b9a39ceda650888ff233bc32542"} Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.381300 4717 scope.go:117] "RemoveContainer" containerID="ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.381454 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8f7d9df-vgpqz" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.447227 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrg9b\" (UniqueName: \"kubernetes.io/projected/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-kube-api-access-qrg9b\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.483213 4717 scope.go:117] "RemoveContainer" containerID="b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.584200 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637188 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:08 crc kubenswrapper[4717]: E1007 14:13:08.637553 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="dnsmasq-dns" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637568 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="dnsmasq-dns" Oct 07 14:13:08 crc kubenswrapper[4717]: E1007 14:13:08.637599 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="init" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637605 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="init" Oct 07 14:13:08 crc kubenswrapper[4717]: E1007 14:13:08.637630 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" containerName="neutron-db-sync" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637639 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" containerName="neutron-db-sync" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637819 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" containerName="dnsmasq-dns" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.637843 4717 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" containerName="neutron-db-sync" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.638721 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.719993 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.721583 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.725283 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.755160 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.780055 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.782590 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.782663 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.782699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95vxf\" (UniqueName: \"kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.783061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.783104 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.783121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " 
pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884850 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhr9r\" (UniqueName: \"kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884890 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95vxf\" (UniqueName: \"kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884927 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884959 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.884984 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs\") pod \"neutron-5b594ccc88-lfbdk\" (UID: 
\"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.885039 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.885071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.888984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.889852 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.890465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.890966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.893443 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:08 crc kubenswrapper[4717]: I1007 14:13:08.943365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config" (OuterVolumeSpecName: "config") pod "cb6f9762-a7d3-48ee-97ce-57439f4ee323" (UID: "cb6f9762-a7d3-48ee-97ce-57439f4ee323"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.978879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95vxf\" (UniqueName: \"kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf\") pod \"dnsmasq-dns-5784cf869f-wtwxc\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.989893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.989951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhr9r\" (UniqueName: \"kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.990002 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.990046 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.990145 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.991086 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cb6f9762-a7d3-48ee-97ce-57439f4ee323-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.995543 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.996646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:08.996972 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.010569 
4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config" (OuterVolumeSpecName: "config") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.045148 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.045388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.045519 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.048963 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhr9r\" (UniqueName: \"kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.049527 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs\") pod \"neutron-5b594ccc88-lfbdk\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.065733 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.083453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04f3bb08-dd18-42b6-ab66-c82b8f9966ff" (UID: "04f3bb08-dd18-42b6-ab66-c82b8f9966ff"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.094235 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.094274 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.094284 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.094294 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.094302 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04f3bb08-dd18-42b6-ab66-c82b8f9966ff-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.220168 4717 scope.go:117] "RemoveContainer" containerID="ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349" Oct 07 14:13:09 crc kubenswrapper[4717]: E1007 14:13:09.221619 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349\": container with ID starting with ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349 not found: ID does not exist" containerID="ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.221648 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349"} err="failed to get container status \"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349\": rpc error: code = NotFound desc = could not find container \"ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349\": container with ID starting with ec190e24f427f90a569b2ccd7f6a328459c0509e2c54c5b3563d937019e49349 not found: ID does not exist" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.221671 4717 scope.go:117] "RemoveContainer" containerID="b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3" Oct 07 14:13:09 crc kubenswrapper[4717]: E1007 14:13:09.221935 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3\": container with ID starting with b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3 not found: ID does not exist" containerID="b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.221953 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3"} err="failed to get container status 
\"b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3\": rpc error: code = NotFound desc = could not find container \"b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3\": container with ID starting with b2fb210c6ea7e564e0ecd66940964a9b69ad9bf5a10f317be90053c844e2e5e3 not found: ID does not exist" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.362993 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.381546 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.393529 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d8f7d9df-vgpqz"] Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.420653 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerStarted","Data":"a0e111b685bebd3cad91d5e8e0f922413b5a2d9352c054f0b49458c514a09169"} Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.423184 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.442391 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerStarted","Data":"e3f98de8706573b91e2b44cf79affc2f7cb03d21f73a9b7347599bb3c58da4f8"} Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.450622 4717 generic.go:334] "Generic (PLEG): container finished" podID="e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" containerID="6f55bcd2e60d0022470edad864bc30b0c5534b18262a1896e83bce5f2cf58779" exitCode=0 Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.450699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" event={"ID":"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4","Type":"ContainerDied","Data":"6f55bcd2e60d0022470edad864bc30b0c5534b18262a1896e83bce5f2cf58779"} Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.471377 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=4.395947153 podStartE2EDuration="5.471358338s" podCreationTimestamp="2025-10-07 14:13:04 +0000 UTC" firstStartedPulling="2025-10-07 14:13:05.709149183 +0000 UTC m=+1167.537074975" lastFinishedPulling="2025-10-07 14:13:06.784560358 +0000 UTC m=+1168.612486160" observedRunningTime="2025-10-07 14:13:09.46779666 +0000 UTC m=+1171.295722452" watchObservedRunningTime="2025-10-07 14:13:09.471358338 +0000 UTC m=+1171.299284130" Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.478948 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerStarted","Data":"104456aa6bf6f58617ecb124f755829e0bc240f3713eda0f24ac039e06194c48"} Oct 07 14:13:09 crc kubenswrapper[4717]: I1007 14:13:09.754352 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.488122 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerStarted","Data":"5be738e3363caa2bd0ac75ebfeb978e125bbcadaee9f37277e51286eb683897f"} Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.488144 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api-log" containerID="cri-o://72a841aebd7b7eae1bd3668f4555f016e1b8cde51e81a7949aabb9a58a8ef12e" gracePeriod=30 Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.488176 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.488204 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api" containerID="cri-o://5be738e3363caa2bd0ac75ebfeb978e125bbcadaee9f37277e51286eb683897f" gracePeriod=30 Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.505835 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerStarted","Data":"aaee56a7fb9083ae7c3bd25eb470d4dd5396147c3d33941c7d0c0a9490530e3d"} Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.508207 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerStarted","Data":"740d795e0cb5f6de980dd0eabcf8cf566f0e42a9595d7a1b2b13c7fe407b3e83"} Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.510339 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerStarted","Data":"586132d8ba4ee20901a707b885dca94c20b37ea13f570eaed63b4fd3a257e297"} Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.536359 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.536340493 podStartE2EDuration="6.536340493s" podCreationTimestamp="2025-10-07 14:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:10.515928804 +0000 UTC m=+1172.343854606" watchObservedRunningTime="2025-10-07 14:13:10.536340493 +0000 UTC m=+1172.364266285" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.549600 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=5.353681612 podStartE2EDuration="6.549582726s" podCreationTimestamp="2025-10-07 14:13:04 +0000 UTC" firstStartedPulling="2025-10-07 14:13:05.952878475 +0000 UTC m=+1167.780804257" lastFinishedPulling="2025-10-07 14:13:07.148779579 +0000 UTC m=+1168.976705371" observedRunningTime="2025-10-07 14:13:10.546506122 +0000 UTC m=+1172.374431914" watchObservedRunningTime="2025-10-07 14:13:10.549582726 +0000 UTC m=+1172.377508518" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.581734 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.6401734999999995 podStartE2EDuration="6.581706896s" podCreationTimestamp="2025-10-07 14:13:04 +0000 UTC" firstStartedPulling="2025-10-07 14:13:05.182341712 +0000 UTC m=+1167.010267504" lastFinishedPulling="2025-10-07 14:13:06.123875108 +0000 UTC 
m=+1167.951800900" observedRunningTime="2025-10-07 14:13:10.577254894 +0000 UTC m=+1172.405180686" watchObservedRunningTime="2025-10-07 14:13:10.581706896 +0000 UTC m=+1172.409632688" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.704577 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827387 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827488 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827624 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkfcc\" (UniqueName: \"kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827665 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827701 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.827725 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb\") pod \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\" (UID: \"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4\") " Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.845324 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc" (OuterVolumeSpecName: "kube-api-access-mkfcc") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "kube-api-access-mkfcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.865924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.916506 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config" (OuterVolumeSpecName: "config") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.929398 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkfcc\" (UniqueName: \"kubernetes.io/projected/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-kube-api-access-mkfcc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.929429 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.929439 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.936710 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:10 crc kubenswrapper[4717]: I1007 14:13:10.941633 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f3bb08-dd18-42b6-ab66-c82b8f9966ff" path="/var/lib/kubelet/pods/04f3bb08-dd18-42b6-ab66-c82b8f9966ff/volumes" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.007540 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.013711 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" (UID: "e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.032225 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.032264 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.032274 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.539850 4717 generic.go:334] "Generic (PLEG): container finished" podID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerID="5be738e3363caa2bd0ac75ebfeb978e125bbcadaee9f37277e51286eb683897f" exitCode=0 Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.539888 4717 generic.go:334] "Generic (PLEG): container finished" podID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerID="72a841aebd7b7eae1bd3668f4555f016e1b8cde51e81a7949aabb9a58a8ef12e" exitCode=143 Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.539932 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerDied","Data":"5be738e3363caa2bd0ac75ebfeb978e125bbcadaee9f37277e51286eb683897f"} Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.539959 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerDied","Data":"72a841aebd7b7eae1bd3668f4555f016e1b8cde51e81a7949aabb9a58a8ef12e"} Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.550319 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" event={"ID":"e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4","Type":"ContainerDied","Data":"baeb2c40062f3d280c185b9a4b197d199a52223cd5320bf261bcef4dc6a65978"} Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.550409 4717 scope.go:117] "RemoveContainer" containerID="6f55bcd2e60d0022470edad864bc30b0c5534b18262a1896e83bce5f2cf58779" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.550574 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-skt9t" Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.735103 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:11 crc kubenswrapper[4717]: I1007 14:13:11.751302 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-skt9t"] Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.084524 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.168729 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.168799 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.168873 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.168932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.168979 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvmt9\" (UniqueName: \"kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.169051 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.169197 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle\") pod \"8a04504c-84a6-4fdc-bd94-dd20f0928872\" (UID: \"8a04504c-84a6-4fdc-bd94-dd20f0928872\") " Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.174004 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs" (OuterVolumeSpecName: "logs") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.174372 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts" (OuterVolumeSpecName: "scripts") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.175241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.177110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9" (OuterVolumeSpecName: "kube-api-access-wvmt9") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "kube-api-access-wvmt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.181843 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.205864 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.254667 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:12 crc kubenswrapper[4717]: W1007 14:13:12.254770 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26678e24_2754_454e_abdc_3add4caf4c81.slice/crio-d555c375a738d298a5a3812493b00490dc9e44026b11f6d424213b1096393e3c WatchSource:0}: Error finding container d555c375a738d298a5a3812493b00490dc9e44026b11f6d424213b1096393e3c: Status 404 returned error can't find the container with id d555c375a738d298a5a3812493b00490dc9e44026b11f6d424213b1096393e3c Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275302 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275361 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a04504c-84a6-4fdc-bd94-dd20f0928872-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275372 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvmt9\" (UniqueName: \"kubernetes.io/projected/8a04504c-84a6-4fdc-bd94-dd20f0928872-kube-api-access-wvmt9\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275381 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275390 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.275398 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a04504c-84a6-4fdc-bd94-dd20f0928872-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.330972 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-59986d7f85-m2phf"] Oct 07 14:13:12 crc kubenswrapper[4717]: E1007 14:13:12.331562 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" containerName="init" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331581 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" containerName="init" Oct 07 14:13:12 crc kubenswrapper[4717]: E1007 14:13:12.331630 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api-log" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331638 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api-log" Oct 07 14:13:12 crc kubenswrapper[4717]: E1007 14:13:12.331655 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331662 4717 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331893 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" containerName="init" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331908 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api-log" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.331923 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" containerName="cinder-api" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.333670 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.341593 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.341741 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.378080 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59986d7f85-m2phf"] Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.428753 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data" (OuterVolumeSpecName: "config-data") pod "8a04504c-84a6-4fdc-bd94-dd20f0928872" (UID: "8a04504c-84a6-4fdc-bd94-dd20f0928872"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.485673 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-combined-ca-bundle\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.487045 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-public-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.487534 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gssfl\" (UniqueName: \"kubernetes.io/projected/c8606e01-73db-4dce-92b0-89d47762aa09-kube-api-access-gssfl\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.487902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-httpd-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.487958 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-ovndb-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.488095 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-internal-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.488221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.488369 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a04504c-84a6-4fdc-bd94-dd20f0928872-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.495228 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590202 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-public-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gssfl\" (UniqueName: \"kubernetes.io/projected/c8606e01-73db-4dce-92b0-89d47762aa09-kube-api-access-gssfl\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-httpd-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-ovndb-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590701 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-internal-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590768 4717 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.590828 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-combined-ca-bundle\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.591455 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474564558-fss6t" event={"ID":"237c55e8-afd6-4798-b0cf-7f8b20b0e323","Type":"ContainerStarted","Data":"c591883315cac3c85ac5bc521020cb8cc1edb5a1fa7c10f062919c2a62e2b8fc"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.591484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-8474564558-fss6t" event={"ID":"237c55e8-afd6-4798-b0cf-7f8b20b0e323","Type":"ContainerStarted","Data":"ec06c80016a92c1e4d864a28b86e990eba305b8417e0e796ef499d8119587a7d"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.603219 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerStarted","Data":"5ba0997fa009baec2d8316784fd7b2a375dd71fdb1ae430a78c79353036adade"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.604625 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-httpd-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.616735 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-public-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.620672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-ovndb-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.625756 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-combined-ca-bundle\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.625995 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-internal-tls-certs\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.626272 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-8474564558-fss6t" podStartSLOduration=2.467082006 podStartE2EDuration="6.626250546s" podCreationTimestamp="2025-10-07 14:13:06 +0000 UTC" firstStartedPulling="2025-10-07 14:13:07.608403545 +0000 UTC m=+1169.436329337" lastFinishedPulling="2025-10-07 14:13:11.767572095 +0000 UTC m=+1173.595497877" observedRunningTime="2025-10-07 14:13:12.622263917 +0000 UTC m=+1174.450189709" watchObservedRunningTime="2025-10-07 14:13:12.626250546 +0000 UTC m=+1174.454176338" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.630379 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c8606e01-73db-4dce-92b0-89d47762aa09-config\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.630782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerStarted","Data":"5d3625f2766e914b8f88be4116ee8067b151d5f04dec70cd325693a063cc7704"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.630835 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.630852 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.631660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gssfl\" (UniqueName: \"kubernetes.io/projected/c8606e01-73db-4dce-92b0-89d47762aa09-kube-api-access-gssfl\") pod \"neutron-59986d7f85-m2phf\" (UID: \"c8606e01-73db-4dce-92b0-89d47762aa09\") " pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.652398 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-575df7c864-lbv94" podStartSLOduration=6.6523811120000005 podStartE2EDuration="6.652381112s" podCreationTimestamp="2025-10-07 14:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:12.651145809 +0000 UTC m=+1174.479071611" watchObservedRunningTime="2025-10-07 14:13:12.652381112 +0000 UTC m=+1174.480306904" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.658273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" event={"ID":"26678e24-2754-454e-abdc-3add4caf4c81","Type":"ContainerStarted","Data":"d555c375a738d298a5a3812493b00490dc9e44026b11f6d424213b1096393e3c"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.669247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8a04504c-84a6-4fdc-bd94-dd20f0928872","Type":"ContainerDied","Data":"78ec01c531b87c9fcbdd4d8ba7388c1a4e53ed9aa188618de1e0c8b98b7c6dc2"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.669297 4717 scope.go:117] "RemoveContainer" containerID="5be738e3363caa2bd0ac75ebfeb978e125bbcadaee9f37277e51286eb683897f" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.669386 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.696729 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.696763 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b648ffc67-ks8w7" event={"ID":"c96f41ab-26d3-44e4-8ad3-732b104a09df","Type":"ContainerStarted","Data":"b05369daad2fa6b40307a5b2911417bc8beee05cc8d468b4afb9a9816aac5f3c"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.696800 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b648ffc67-ks8w7" event={"ID":"c96f41ab-26d3-44e4-8ad3-732b104a09df","Type":"ContainerStarted","Data":"de13d6c56a46730383a48778cd00d02c7d24e2770a7840343a26ceefac9a4c85"} Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.726841 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-b648ffc67-ks8w7" podStartSLOduration=2.361687959 podStartE2EDuration="6.726822233s" podCreationTimestamp="2025-10-07 14:13:06 +0000 UTC" firstStartedPulling="2025-10-07 14:13:07.386062532 +0000 UTC m=+1169.213988324" lastFinishedPulling="2025-10-07 14:13:11.751196806 +0000 UTC m=+1173.579122598" observedRunningTime="2025-10-07 14:13:12.722132014 +0000 UTC m=+1174.550057806" watchObservedRunningTime="2025-10-07 14:13:12.726822233 +0000 UTC m=+1174.554748035" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.899914 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4" path="/var/lib/kubelet/pods/e8fe4c71-f7f4-4a56-b996-3cb8563dcbd4/volumes" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.988041 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.993608 4717 scope.go:117] "RemoveContainer" containerID="72a841aebd7b7eae1bd3668f4555f016e1b8cde51e81a7949aabb9a58a8ef12e" Oct 07 14:13:12 crc kubenswrapper[4717]: I1007 14:13:12.998566 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.024447 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.027493 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.035660 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.035881 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.047257 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.050428 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102315 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-scripts\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102344 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2ad858d-25ea-41d2-8499-188e72ca0873-logs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102443 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8l44\" (UniqueName: \"kubernetes.io/projected/c2ad858d-25ea-41d2-8499-188e72ca0873-kube-api-access-p8l44\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102463 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2ad858d-25ea-41d2-8499-188e72ca0873-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102483 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.102749 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.204379 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.204707 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8l44\" (UniqueName: \"kubernetes.io/projected/c2ad858d-25ea-41d2-8499-188e72ca0873-kube-api-access-p8l44\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.204729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2ad858d-25ea-41d2-8499-188e72ca0873-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.204750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.204857 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c2ad858d-25ea-41d2-8499-188e72ca0873-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.205497 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.205557 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.205590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-scripts\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.205605 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.205640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2ad858d-25ea-41d2-8499-188e72ca0873-logs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.206028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2ad858d-25ea-41d2-8499-188e72ca0873-logs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.210990 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.212195 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.213614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.213807 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.213869 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-scripts\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.220090 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c2ad858d-25ea-41d2-8499-188e72ca0873-config-data-custom\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.250853 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8l44\" (UniqueName: \"kubernetes.io/projected/c2ad858d-25ea-41d2-8499-188e72ca0873-kube-api-access-p8l44\") pod \"cinder-api-0\" (UID: \"c2ad858d-25ea-41d2-8499-188e72ca0873\") " pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.371779 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-59986d7f85-m2phf"] Oct 07 14:13:13 crc kubenswrapper[4717]: W1007 14:13:13.373568 
4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8606e01_73db_4dce_92b0_89d47762aa09.slice/crio-cde1f376d4ce8d01c827b6f55b8c261b4a92024579840aca55b8dbc75fbf92be WatchSource:0}: Error finding container cde1f376d4ce8d01c827b6f55b8c261b4a92024579840aca55b8dbc75fbf92be: Status 404 returned error can't find the container with id cde1f376d4ce8d01c827b6f55b8c261b4a92024579840aca55b8dbc75fbf92be Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.380740 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.784102 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerStarted","Data":"3bf00faee54a5d7ebf3ec00ac8a8e039f45668051403d10301659fac663a7a4b"} Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.784631 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerStarted","Data":"d83a1c1d4c5ccc4c5f0d50a08e3c5f8730287a5b4c216cef68a918d5fcb2e5b1"} Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.785709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59986d7f85-m2phf" event={"ID":"c8606e01-73db-4dce-92b0-89d47762aa09","Type":"ContainerStarted","Data":"cde1f376d4ce8d01c827b6f55b8c261b4a92024579840aca55b8dbc75fbf92be"} Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.788582 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.795308 4717 generic.go:334] "Generic (PLEG): container finished" podID="26678e24-2754-454e-abdc-3add4caf4c81" containerID="171d45b0003dbaab76e8a3832a18db738dd7399350976176a2dd79b1efa38fce" exitCode=0 Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.796216 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" event={"ID":"26678e24-2754-454e-abdc-3add4caf4c81","Type":"ContainerDied","Data":"171d45b0003dbaab76e8a3832a18db738dd7399350976176a2dd79b1efa38fce"} Oct 07 14:13:13 crc kubenswrapper[4717]: I1007 14:13:13.834507 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b594ccc88-lfbdk" podStartSLOduration=5.834484517 podStartE2EDuration="5.834484517s" podCreationTimestamp="2025-10-07 14:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:13.810077668 +0000 UTC m=+1175.638003500" watchObservedRunningTime="2025-10-07 14:13:13.834484517 +0000 UTC m=+1175.662410309" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.197440 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 07 14:13:14 crc kubenswrapper[4717]: W1007 14:13:14.220177 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ad858d_25ea_41d2_8499_188e72ca0873.slice/crio-3aab60f19f726223a97e0452da53e733c64955a2fdf2d3ae2b2203d9c164eae7 WatchSource:0}: Error finding container 3aab60f19f726223a97e0452da53e733c64955a2fdf2d3ae2b2203d9c164eae7: Status 404 returned error can't find the container with id 
3aab60f19f726223a97e0452da53e733c64955a2fdf2d3ae2b2203d9c164eae7 Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.643538 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.813219 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" event={"ID":"26678e24-2754-454e-abdc-3add4caf4c81","Type":"ContainerStarted","Data":"0c7560b6c5c12fb1a17cabe6cf83cdb6a91e53a8e5675120e7d6f638ba5cecd3"} Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.813357 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.814793 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2ad858d-25ea-41d2-8499-188e72ca0873","Type":"ContainerStarted","Data":"3aab60f19f726223a97e0452da53e733c64955a2fdf2d3ae2b2203d9c164eae7"} Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.819033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59986d7f85-m2phf" event={"ID":"c8606e01-73db-4dce-92b0-89d47762aa09","Type":"ContainerStarted","Data":"a78442237976262889bca1f4f24dda831bc341f3f57d3b3bc75a2f27d8c7008e"} Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.819086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-59986d7f85-m2phf" event={"ID":"c8606e01-73db-4dce-92b0-89d47762aa09","Type":"ContainerStarted","Data":"b88a88af19bde89b1dd855a3173cfb712808877834a05102744b629de7ebbce8"} Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.819289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.842673 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" podStartSLOduration=6.842631875 podStartE2EDuration="6.842631875s" podCreationTimestamp="2025-10-07 14:13:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:14.841819152 +0000 UTC m=+1176.669744974" watchObservedRunningTime="2025-10-07 14:13:14.842631875 +0000 UTC m=+1176.670557667" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.859517 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-59986d7f85-m2phf" podStartSLOduration=2.859499727 podStartE2EDuration="2.859499727s" podCreationTimestamp="2025-10-07 14:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:14.858498429 +0000 UTC m=+1176.686424221" watchObservedRunningTime="2025-10-07 14:13:14.859499727 +0000 UTC m=+1176.687425519" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.882019 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a04504c-84a6-4fdc-bd94-dd20f0928872" path="/var/lib/kubelet/pods/8a04504c-84a6-4fdc-bd94-dd20f0928872/volumes" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.891496 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.987772 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cbb74df64-qfwg8"] 
Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.990772 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.995517 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 07 14:13:14 crc kubenswrapper[4717]: I1007 14:13:14.995693 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.000462 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbb74df64-qfwg8"] Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173043 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173234 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-public-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173300 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-internal-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qsqg\" (UniqueName: \"kubernetes.io/projected/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-kube-api-access-4qsqg\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173477 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-combined-ca-bundle\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data-custom\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.173618 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-logs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " 
pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.213707 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274766 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274863 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-public-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-internal-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274915 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qsqg\" (UniqueName: \"kubernetes.io/projected/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-kube-api-access-4qsqg\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274955 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-combined-ca-bundle\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.274981 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data-custom\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.275043 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-logs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.275512 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-logs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.280277 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.283085 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-public-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.283627 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data-custom\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.284506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-config-data\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.284614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-combined-ca-bundle\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.287558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-internal-tls-certs\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.293948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qsqg\" (UniqueName: \"kubernetes.io/projected/2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb-kube-api-access-4qsqg\") pod \"barbican-api-7cbb74df64-qfwg8\" (UID: \"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb\") " pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.312324 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.323518 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.408591 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.428662 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-74d44865f4-vrndk" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.464402 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.833236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c2ad858d-25ea-41d2-8499-188e72ca0873","Type":"ContainerStarted","Data":"f6eea3702cce7f8f4cf871e710a960ff23d75d1da51c0c873e6b13886c556807"} Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.833361 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="cinder-backup" containerID="cri-o://f935be5ff57334500488699d29ae60742d2c62839fcb9dd024d0433adf3a4179" gracePeriod=30 Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.834464 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-backup-0" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="probe" containerID="cri-o://e3f98de8706573b91e2b44cf79affc2f7cb03d21f73a9b7347599bb3c58da4f8" gracePeriod=30 Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.834739 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="cinder-scheduler" containerID="cri-o://a0e111b685bebd3cad91d5e8e0f922413b5a2d9352c054f0b49458c514a09169" gracePeriod=30 Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.834858 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="probe" containerID="cri-o://aaee56a7fb9083ae7c3bd25eb470d4dd5396147c3d33941c7d0c0a9490530e3d" gracePeriod=30 Oct 07 14:13:15 crc kubenswrapper[4717]: I1007 14:13:15.915332 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.179512 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cbb74df64-qfwg8"] Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.844464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbb74df64-qfwg8" event={"ID":"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb","Type":"ContainerStarted","Data":"3acd514ab1fd76597a51a7a2bf11116fad8219ea8469abf1710ec6b48bbdf18a"} Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.845971 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbb74df64-qfwg8" event={"ID":"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb","Type":"ContainerStarted","Data":"790d82e8c77a519a843c142ac6f48e52710add7aec3927eccfafd1c22f9db54b"} Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.853589 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="cinder-volume" containerID="cri-o://104456aa6bf6f58617ecb124f755829e0bc240f3713eda0f24ac039e06194c48" gracePeriod=30 Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.853764 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-volume-volume1-0" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="probe" containerID="cri-o://586132d8ba4ee20901a707b885dca94c20b37ea13f570eaed63b4fd3a257e297" gracePeriod=30 Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.853897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"c2ad858d-25ea-41d2-8499-188e72ca0873","Type":"ContainerStarted","Data":"aeff446ee02b7a401de47e5540a7ff9dc1e4181df78c7e6cc6557e1a429bf4ec"} Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.854255 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 07 14:13:16 crc kubenswrapper[4717]: I1007 14:13:16.876832 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.87681211 podStartE2EDuration="4.87681211s" podCreationTimestamp="2025-10-07 14:13:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:16.874377733 +0000 UTC m=+1178.702303565" watchObservedRunningTime="2025-10-07 14:13:16.87681211 +0000 UTC m=+1178.704737892" Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.005175 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.307865 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5848b6684d-cddjh" Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.884938 4717 generic.go:334] "Generic (PLEG): container finished" podID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerID="e3f98de8706573b91e2b44cf79affc2f7cb03d21f73a9b7347599bb3c58da4f8" exitCode=0 Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.888070 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerDied","Data":"e3f98de8706573b91e2b44cf79affc2f7cb03d21f73a9b7347599bb3c58da4f8"} Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.893439 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cbb74df64-qfwg8" event={"ID":"2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb","Type":"ContainerStarted","Data":"b7984f670823c34b5c9476e1953562dfe338c31b4b81fa464384e8c24d88e538"} Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.895881 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.895928 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.898913 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerID="aaee56a7fb9083ae7c3bd25eb470d4dd5396147c3d33941c7d0c0a9490530e3d" exitCode=0 Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.898959 4717 generic.go:334] "Generic (PLEG): container finished" podID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerID="a0e111b685bebd3cad91d5e8e0f922413b5a2d9352c054f0b49458c514a09169" exitCode=0 Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.899675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerDied","Data":"aaee56a7fb9083ae7c3bd25eb470d4dd5396147c3d33941c7d0c0a9490530e3d"} Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.899717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerDied","Data":"a0e111b685bebd3cad91d5e8e0f922413b5a2d9352c054f0b49458c514a09169"} Oct 07 14:13:17 crc kubenswrapper[4717]: I1007 14:13:17.944476 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cbb74df64-qfwg8" podStartSLOduration=3.944446128 podStartE2EDuration="3.944446128s" podCreationTimestamp="2025-10-07 14:13:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:17.922479046 +0000 UTC m=+1179.750404838" watchObservedRunningTime="2025-10-07 14:13:17.944446128 +0000 UTC m=+1179.772371920" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.179246 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.282947 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.283106 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.283131 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.284189 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drnbk\" (UniqueName: \"kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.284515 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.284639 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id\") pod \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\" (UID: \"4c971bcc-76c9-43c4-9819-de5e25ed2a3f\") " Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.285138 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.297213 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.313976 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk" (OuterVolumeSpecName: "kube-api-access-drnbk") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "kube-api-access-drnbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.321941 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts" (OuterVolumeSpecName: "scripts") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.355943 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.386122 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.386160 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.386172 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.386184 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drnbk\" (UniqueName: \"kubernetes.io/projected/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-kube-api-access-drnbk\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.386196 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.418200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data" (OuterVolumeSpecName: "config-data") pod "4c971bcc-76c9-43c4-9819-de5e25ed2a3f" (UID: "4c971bcc-76c9-43c4-9819-de5e25ed2a3f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.488133 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c971bcc-76c9-43c4-9819-de5e25ed2a3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.636727 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.771875 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 07 14:13:18 crc kubenswrapper[4717]: E1007 14:13:18.772271 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="cinder-scheduler" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.772288 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="cinder-scheduler" Oct 07 14:13:18 crc kubenswrapper[4717]: E1007 14:13:18.772320 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="probe" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.772330 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="probe" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.772512 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="probe" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.772531 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" containerName="cinder-scheduler" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.773106 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.776674 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.776897 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.777533 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-ltsbz" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.825150 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.901165 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config-secret\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.901594 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.901633 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rspp\" (UniqueName: \"kubernetes.io/projected/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-kube-api-access-5rspp\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.901648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.914475 4717 generic.go:334] "Generic (PLEG): container finished" podID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerID="586132d8ba4ee20901a707b885dca94c20b37ea13f570eaed63b4fd3a257e297" exitCode=0 Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.914812 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerDied","Data":"586132d8ba4ee20901a707b885dca94c20b37ea13f570eaed63b4fd3a257e297"} Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.917317 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c971bcc-76c9-43c4-9819-de5e25ed2a3f","Type":"ContainerDied","Data":"da4ca233fe96ad22f77aa2a601fb04b7a38440f70e4d3bcaf5ad4ad4ab8ad081"} Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.917382 4717 scope.go:117] "RemoveContainer" containerID="aaee56a7fb9083ae7c3bd25eb470d4dd5396147c3d33941c7d0c0a9490530e3d" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.917567 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.925824 4717 generic.go:334] "Generic (PLEG): container finished" podID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerID="f935be5ff57334500488699d29ae60742d2c62839fcb9dd024d0433adf3a4179" exitCode=0 Oct 07 14:13:18 crc kubenswrapper[4717]: I1007 14:13:18.926484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerDied","Data":"f935be5ff57334500488699d29ae60742d2c62839fcb9dd024d0433adf3a4179"} Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:18.988247 4717 scope.go:117] "RemoveContainer" containerID="a0e111b685bebd3cad91d5e8e0f922413b5a2d9352c054f0b49458c514a09169" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.000872 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.003096 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.003148 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rspp\" (UniqueName: \"kubernetes.io/projected/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-kube-api-access-5rspp\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.003176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.003254 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config-secret\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.011561 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.014031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.036161 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.044637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rspp\" (UniqueName: 
\"kubernetes.io/projected/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-kube-api-access-5rspp\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.044731 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.047978 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.052038 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.057758 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.081467 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2-openstack-config-secret\") pod \"openstackclient\" (UID: \"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2\") " pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.086830 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208255 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208358 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208411 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208499 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbz9n\" (UniqueName: \"kubernetes.io/projected/348f889a-8a84-4e54-81cf-46ee269e85d9-kube-api-access-rbz9n\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.208563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/348f889a-8a84-4e54-81cf-46ee269e85d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310714 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbz9n\" (UniqueName: \"kubernetes.io/projected/348f889a-8a84-4e54-81cf-46ee269e85d9-kube-api-access-rbz9n\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310806 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/348f889a-8a84-4e54-81cf-46ee269e85d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310849 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310952 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310970 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.310989 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.317263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/348f889a-8a84-4e54-81cf-46ee269e85d9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.327383 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.327810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.329805 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-scripts\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.334275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/348f889a-8a84-4e54-81cf-46ee269e85d9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.341118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbz9n\" (UniqueName: \"kubernetes.io/projected/348f889a-8a84-4e54-81cf-46ee269e85d9-kube-api-access-rbz9n\") pod \"cinder-scheduler-0\" (UID: \"348f889a-8a84-4e54-81cf-46ee269e85d9\") " pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.367671 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.420433 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.443465 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.480521 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.480747 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="dnsmasq-dns" containerID="cri-o://42c716a25b3cc555eca309b639df43746cbd0c5c1d2f7c2dee42b6a1b9a64e60" gracePeriod=10 Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.529868 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.529936 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.529968 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530046 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530104 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530127 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530185 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd5rs\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530222 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530251 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530270 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530301 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530324 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530427 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.530452 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules\") pod \"3451f8ea-dab2-464f-802d-90acccb50b4d\" (UID: \"3451f8ea-dab2-464f-802d-90acccb50b4d\") " Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.536375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.536425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run" (OuterVolumeSpecName: "run") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.536922 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.537135 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.537157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev" (OuterVolumeSpecName: "dev") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.537169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.537200 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.542156 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.542219 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.543130 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys" (OuterVolumeSpecName: "sys") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.548296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.550448 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs" (OuterVolumeSpecName: "kube-api-access-zd5rs") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "kube-api-access-zd5rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.550535 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph" (OuterVolumeSpecName: "ceph") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.550869 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts" (OuterVolumeSpecName: "scripts") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639441 4717 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-sys\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639490 4717 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639504 4717 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639514 4717 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639526 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639536 4717 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639550 4717 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639559 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639570 4717 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639579 4717 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-dev\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639588 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639596 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd5rs\" (UniqueName: \"kubernetes.io/projected/3451f8ea-dab2-464f-802d-90acccb50b4d-kube-api-access-zd5rs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639606 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3451f8ea-dab2-464f-802d-90acccb50b4d-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.639617 4717 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.810136 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.843107 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.865467 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.947721 4717 generic.go:334] "Generic (PLEG): container finished" podID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerID="104456aa6bf6f58617ecb124f755829e0bc240f3713eda0f24ac039e06194c48" exitCode=0 Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.948028 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerDied","Data":"104456aa6bf6f58617ecb124f755829e0bc240f3713eda0f24ac039e06194c48"} Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.948085 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"a608ec5b-d960-4b6b-946b-c9a36a38810c","Type":"ContainerDied","Data":"e2ab7ccf24204838aa8b0a67b76effa6005eb1af8be40093c68ab39347f3ee9d"} Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.948104 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2ab7ccf24204838aa8b0a67b76effa6005eb1af8be40093c68ab39347f3ee9d" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.953275 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data" (OuterVolumeSpecName: "config-data") pod "3451f8ea-dab2-464f-802d-90acccb50b4d" (UID: "3451f8ea-dab2-464f-802d-90acccb50b4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.986308 4717 generic.go:334] "Generic (PLEG): container finished" podID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerID="42c716a25b3cc555eca309b639df43746cbd0c5c1d2f7c2dee42b6a1b9a64e60" exitCode=0 Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.986365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" event={"ID":"1396fa07-0a19-481d-ae2b-f943755ea2ad","Type":"ContainerDied","Data":"42c716a25b3cc555eca309b639df43746cbd0c5c1d2f7c2dee42b6a1b9a64e60"} Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.994601 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.996281 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.996278 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3451f8ea-dab2-464f-802d-90acccb50b4d","Type":"ContainerDied","Data":"3e9b0d73847caf4dacf73739b540618379af9787c79c82aeafa8dcc81befef58"} Oct 07 14:13:19 crc kubenswrapper[4717]: I1007 14:13:19.996533 4717 scope.go:117] "RemoveContainer" containerID="e3f98de8706573b91e2b44cf79affc2f7cb03d21f73a9b7347599bb3c58da4f8" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:19.998236 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2","Type":"ContainerStarted","Data":"4b60015b6f4c0bf2eb37953b0b8475eefca74c2382be06ad2185515d6739db9a"} Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.050095 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3451f8ea-dab2-464f-802d-90acccb50b4d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.063624 4717 scope.go:117] "RemoveContainer" containerID="f935be5ff57334500488699d29ae60742d2c62839fcb9dd024d0433adf3a4179" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.063695 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.084363 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.098171 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:20 crc kubenswrapper[4717]: E1007 14:13:20.102002 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="cinder-volume" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102070 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="cinder-volume" Oct 07 14:13:20 crc kubenswrapper[4717]: E1007 14:13:20.102112 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102123 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: E1007 14:13:20.102144 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="cinder-backup" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102150 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="cinder-backup" Oct 07 14:13:20 crc kubenswrapper[4717]: E1007 14:13:20.102192 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102200 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102754 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102798 4717 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="probe" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102827 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" containerName="cinder-volume" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.102845 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" containerName="cinder-backup" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.110117 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.112966 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.159239 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.161273 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.161392 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.161491 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.161598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162738 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162852 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7hq4\" (UniqueName: 
\"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162958 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163080 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163411 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163569 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.163643 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme\") pod \"a608ec5b-d960-4b6b-946b-c9a36a38810c\" (UID: \"a608ec5b-d960-4b6b-946b-c9a36a38810c\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.160616 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162596 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.162627 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys" (OuterVolumeSpecName: "sys") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.165157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev" (OuterVolumeSpecName: "dev") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.165216 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.165695 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.167228 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.167270 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.167292 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.169027 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run" (OuterVolumeSpecName: "run") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.171518 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4" (OuterVolumeSpecName: "kube-api-access-c7hq4") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "kube-api-access-c7hq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.174107 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.177704 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.183480 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph" (OuterVolumeSpecName: "ceph") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.188381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts" (OuterVolumeSpecName: "scripts") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.234891 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266256 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266297 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266315 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266356 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266389 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-dev\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266410 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-scripts\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266426 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-run\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266445 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-ceph\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266717 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-lib-modules\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266790 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-sys\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266822 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266867 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtsp\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-kube-api-access-kmtsp\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266887 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266950 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266963 4717 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-cinder\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266972 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266981 4717 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-locks-brick\") 
on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.266993 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7hq4\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-kube-api-access-c7hq4\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267028 4717 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-lib-modules\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267041 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a608ec5b-d960-4b6b-946b-c9a36a38810c-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267053 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-var-lib-cinder\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267065 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267078 4717 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267088 4717 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-dev\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267098 4717 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-nvme\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267106 4717 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267114 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.267124 4717 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a608ec5b-d960-4b6b-946b-c9a36a38810c-sys\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.272919 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.348638 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data" (OuterVolumeSpecName: "config-data") pod "a608ec5b-d960-4b6b-946b-c9a36a38810c" (UID: "a608ec5b-d960-4b6b-946b-c9a36a38810c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.371929 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372061 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-sys\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372091 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372113 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtsp\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-kube-api-access-kmtsp\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372150 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372180 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372195 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372363 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372437 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-brick\") 
pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372527 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-dev\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-scripts\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-run\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372608 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-ceph\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372665 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-lib-modules\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372737 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a608ec5b-d960-4b6b-946b-c9a36a38810c-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-lib-modules\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372838 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372860 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-sys\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372948 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.372994 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.375133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.375221 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.377135 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.377204 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-dev\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.377229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-run\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.380089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.381214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-config-data\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.383483 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.384402 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-ceph\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.390432 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-scripts\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.393430 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtsp\" (UniqueName: \"kubernetes.io/projected/d6be3c5a-b5c5-49a0-ae43-f90c44cf6496-kube-api-access-kmtsp\") pod \"cinder-backup-0\" (UID: \"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496\") " pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.457593 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.673158 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779193 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779263 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779322 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppt87\" (UniqueName: \"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779602 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779645 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.779724 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config\") pod \"1396fa07-0a19-481d-ae2b-f943755ea2ad\" (UID: \"1396fa07-0a19-481d-ae2b-f943755ea2ad\") " Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.802465 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87" (OuterVolumeSpecName: "kube-api-access-ppt87") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "kube-api-access-ppt87". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.831218 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.838388 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.838701 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.863569 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.875103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config" (OuterVolumeSpecName: "config") pod "1396fa07-0a19-481d-ae2b-f943755ea2ad" (UID: "1396fa07-0a19-481d-ae2b-f943755ea2ad"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.891659 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.893402 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.893417 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.893428 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.893437 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1396fa07-0a19-481d-ae2b-f943755ea2ad-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.893448 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppt87\" (UniqueName: \"kubernetes.io/projected/1396fa07-0a19-481d-ae2b-f943755ea2ad-kube-api-access-ppt87\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.896693 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3451f8ea-dab2-464f-802d-90acccb50b4d" path="/var/lib/kubelet/pods/3451f8ea-dab2-464f-802d-90acccb50b4d/volumes" Oct 07 14:13:20 crc kubenswrapper[4717]: I1007 14:13:20.898427 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c971bcc-76c9-43c4-9819-de5e25ed2a3f" path="/var/lib/kubelet/pods/4c971bcc-76c9-43c4-9819-de5e25ed2a3f/volumes" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.041862 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" event={"ID":"1396fa07-0a19-481d-ae2b-f943755ea2ad","Type":"ContainerDied","Data":"5532342dc50cb1a4fd174e1ca67b50cbb9464b4138e1445899ca4670d3ac7489"} Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.041916 4717 scope.go:117] "RemoveContainer" containerID="42c716a25b3cc555eca309b639df43746cbd0c5c1d2f7c2dee42b6a1b9a64e60" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.041993 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-svjc9" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.049085 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.049478 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"348f889a-8a84-4e54-81cf-46ee269e85d9","Type":"ContainerStarted","Data":"40fadd8ea9eb78c1f78dd3a988fa5f5a780c3462fae330aa614c002348f66d5d"} Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.072067 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.084892 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-svjc9"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.093350 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.108247 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.111814 4717 scope.go:117] "RemoveContainer" containerID="0c4aa575a987e95950b9dc33620acd3e9da9eef5ffc4eeb6b40aacfefdc85574" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.121318 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:21 crc kubenswrapper[4717]: E1007 14:13:21.121718 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="init" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.121734 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="init" Oct 07 14:13:21 crc kubenswrapper[4717]: E1007 14:13:21.121748 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="dnsmasq-dns" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.121754 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="dnsmasq-dns" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.121969 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" containerName="dnsmasq-dns" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.122929 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.126077 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.133562 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.143506 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.307883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-dev\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308304 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308331 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308375 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308398 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308819 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.308976 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-ceph\") pod 
\"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309079 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-run\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309141 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309233 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtjg4\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-kube-api-access-qtjg4\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309536 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309588 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309653 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.309668 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-sys\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.310146 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-sys\") pod \"cinder-volume-volume1-0\" (UID: 
\"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414603 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-sys\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414774 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414934 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-dev\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414959 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.414979 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415037 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415046 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-dev\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: 
\"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415163 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415221 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415241 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-run\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415265 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415269 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415242 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 
14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415308 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtjg4\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-kube-api-access-qtjg4\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415310 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-run\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415405 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415552 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.415719 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/315f03ca-ba00-4899-b836-72bc9a1970eb-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.418550 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.420639 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.430162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.430779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/315f03ca-ba00-4899-b836-72bc9a1970eb-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.430740 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.437749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtjg4\" (UniqueName: \"kubernetes.io/projected/315f03ca-ba00-4899-b836-72bc9a1970eb-kube-api-access-qtjg4\") pod \"cinder-volume-volume1-0\" (UID: \"315f03ca-ba00-4899-b836-72bc9a1970eb\") " pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.457871 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:21 crc kubenswrapper[4717]: I1007 14:13:21.484765 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.075160 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496","Type":"ContainerStarted","Data":"03aaa158381413e06eb3f6ca2aa483ed2a5095215a2c0dbc4d7c2d5613d2c53f"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.075794 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496","Type":"ContainerStarted","Data":"d08e4a0bb13d96e7f8acb41ba958f971db3817612c8a28fe32b56e108c4c8bce"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.081839 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"348f889a-8a84-4e54-81cf-46ee269e85d9","Type":"ContainerStarted","Data":"781c751d2f43069d096cb1f54e5f988d7d4bb8a62a903f9dbbe4fa1439095ebc"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.173480 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 07 14:13:26 crc kubenswrapper[4717]: W1007 14:13:22.179882 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod315f03ca_ba00_4899_b836_72bc9a1970eb.slice/crio-56f43d490f20343b6424190bde5cfba81e848512acec1fac0ca478187b84f447 WatchSource:0}: Error finding container 56f43d490f20343b6424190bde5cfba81e848512acec1fac0ca478187b84f447: Status 404 returned error can't find the container with id 56f43d490f20343b6424190bde5cfba81e848512acec1fac0ca478187b84f447 Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.887641 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1396fa07-0a19-481d-ae2b-f943755ea2ad" path="/var/lib/kubelet/pods/1396fa07-0a19-481d-ae2b-f943755ea2ad/volumes" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:22.888674 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a608ec5b-d960-4b6b-946b-c9a36a38810c" path="/var/lib/kubelet/pods/a608ec5b-d960-4b6b-946b-c9a36a38810c/volumes" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:23.101064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-volume-volume1-0" event={"ID":"315f03ca-ba00-4899-b836-72bc9a1970eb","Type":"ContainerStarted","Data":"56f43d490f20343b6424190bde5cfba81e848512acec1fac0ca478187b84f447"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:23.298611 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:24.119629 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"315f03ca-ba00-4899-b836-72bc9a1970eb","Type":"ContainerStarted","Data":"fe22b20751eecd8fd827d4c104ad8a1a30d6184aa1929ea9b12d9195ff1c13cd"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:24.125854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"348f889a-8a84-4e54-81cf-46ee269e85d9","Type":"ContainerStarted","Data":"4a707cee43466a67b2e62fa14166c02b6af166bc0af8002b2e2fa1fd24d5eb0d"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:24.444074 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.153184 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d6be3c5a-b5c5-49a0-ae43-f90c44cf6496","Type":"ContainerStarted","Data":"169b5c72481a41d0a060e27a0bdfb86b1a75fc721531ae05d8f8f446ec4446aa"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.157397 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"315f03ca-ba00-4899-b836-72bc9a1970eb","Type":"ContainerStarted","Data":"63618c7067f4b485175fb13829b184f1312e7c7026763a0de96cff56e6b64a80"} Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.188554 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=5.188517797 podStartE2EDuration="5.188517797s" podCreationTimestamp="2025-10-07 14:13:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:25.17219434 +0000 UTC m=+1187.000120132" watchObservedRunningTime="2025-10-07 14:13:25.188517797 +0000 UTC m=+1187.016443589" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.189210 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.189202796 podStartE2EDuration="7.189202796s" podCreationTimestamp="2025-10-07 14:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:24.146861821 +0000 UTC m=+1185.974787613" watchObservedRunningTime="2025-10-07 14:13:25.189202796 +0000 UTC m=+1187.017128588" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.458455 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:25.907433 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:26.223210 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=5.223089299 podStartE2EDuration="5.223089299s" podCreationTimestamp="2025-10-07 14:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:26.210825783 +0000 UTC m=+1188.038751575" watchObservedRunningTime="2025-10-07 14:13:26.223089299 +0000 UTC m=+1188.051015091" Oct 07 14:13:26 crc kubenswrapper[4717]: I1007 14:13:26.485888 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:27 crc kubenswrapper[4717]: I1007 14:13:27.697205 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:27 crc kubenswrapper[4717]: I1007 14:13:27.740861 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cbb74df64-qfwg8" Oct 07 14:13:27 crc kubenswrapper[4717]: I1007 14:13:27.815160 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:27 crc kubenswrapper[4717]: I1007 14:13:27.815916 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575df7c864-lbv94" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api" containerID="cri-o://5d3625f2766e914b8f88be4116ee8067b151d5f04dec70cd325693a063cc7704" gracePeriod=30 Oct 07 14:13:27 crc kubenswrapper[4717]: I1007 14:13:27.815480 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-575df7c864-lbv94" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api-log" containerID="cri-o://740d795e0cb5f6de980dd0eabcf8cf566f0e42a9595d7a1b2b13c7fe407b3e83" gracePeriod=30 Oct 07 14:13:28 crc kubenswrapper[4717]: I1007 14:13:28.226318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerDied","Data":"740d795e0cb5f6de980dd0eabcf8cf566f0e42a9595d7a1b2b13c7fe407b3e83"} Oct 07 14:13:28 crc kubenswrapper[4717]: I1007 14:13:28.226182 4717 generic.go:334] "Generic (PLEG): container finished" podID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerID="740d795e0cb5f6de980dd0eabcf8cf566f0e42a9595d7a1b2b13c7fe407b3e83" exitCode=143 Oct 07 14:13:29 crc kubenswrapper[4717]: I1007 14:13:29.670608 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.042064 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-9896cd659-vvdxn"] Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.043793 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.052647 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.052842 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.053211 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9896cd659-vvdxn"] Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.053572 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128754 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442lc\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-kube-api-access-442lc\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-run-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128869 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-internal-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128927 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-combined-ca-bundle\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-config-data\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.128998 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-log-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.129028 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-public-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " 
pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.129076 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-etc-swift\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442lc\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-kube-api-access-442lc\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231412 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-run-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231481 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-internal-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231507 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-combined-ca-bundle\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231546 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-config-data\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231561 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-log-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-public-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.231646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-etc-swift\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc 
kubenswrapper[4717]: I1007 14:13:30.231818 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-run-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.232267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-log-httpd\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.246808 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-etc-swift\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.247232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-public-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.247456 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-combined-ca-bundle\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.251845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442lc\" (UniqueName: \"kubernetes.io/projected/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-kube-api-access-442lc\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.252059 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-config-data\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.256249 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f44baabf-f1c4-4036-8c6b-ce32cc6cf541-internal-tls-certs\") pod \"swift-proxy-9896cd659-vvdxn\" (UID: \"f44baabf-f1c4-4036-8c6b-ce32cc6cf541\") " pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.394124 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:30 crc kubenswrapper[4717]: I1007 14:13:30.770639 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.277689 4717 generic.go:334] "Generic (PLEG): container finished" podID="1703fd90-e328-4f67-850f-38f8663dd2c2" containerID="b93e480bbb5018f3ccdd3bd336d0576e5c6b868c33875b4ba82588ae48cff50f" exitCode=0 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.277884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fdlxw" event={"ID":"1703fd90-e328-4f67-850f-38f8663dd2c2","Type":"ContainerDied","Data":"b93e480bbb5018f3ccdd3bd336d0576e5c6b868c33875b4ba82588ae48cff50f"} Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.282630 4717 generic.go:334] "Generic (PLEG): container finished" podID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerID="5d3625f2766e914b8f88be4116ee8067b151d5f04dec70cd325693a063cc7704" exitCode=0 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.282682 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerDied","Data":"5d3625f2766e914b8f88be4116ee8067b151d5f04dec70cd325693a063cc7704"} Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.512753 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.513114 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="proxy-httpd" containerID="cri-o://54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7" gracePeriod=30 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.513135 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-notification-agent" containerID="cri-o://3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58" gracePeriod=30 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.513086 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-central-agent" containerID="cri-o://0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19" gracePeriod=30 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.513129 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="sg-core" containerID="cri-o://a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567" gracePeriod=30 Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.732659 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.780120 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-575df7c864-lbv94" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Oct 07 14:13:31 crc kubenswrapper[4717]: I1007 14:13:31.780165 4717 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/barbican-api-575df7c864-lbv94" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.166:9311/healthcheck\": dial tcp 10.217.0.166:9311: connect: connection refused" Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.315871 4717 generic.go:334] "Generic (PLEG): container finished" podID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerID="54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7" exitCode=0 Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.315933 4717 generic.go:334] "Generic (PLEG): container finished" podID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerID="a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567" exitCode=2 Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.315953 4717 generic.go:334] "Generic (PLEG): container finished" podID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerID="0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19" exitCode=0 Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.315977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerDied","Data":"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7"} Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.316053 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerDied","Data":"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567"} Oct 07 14:13:32 crc kubenswrapper[4717]: I1007 14:13:32.316070 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerDied","Data":"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19"} Oct 07 14:13:33 crc kubenswrapper[4717]: I1007 14:13:33.960461 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-fdlxw" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.120828 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data\") pod \"1703fd90-e328-4f67-850f-38f8663dd2c2\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.121200 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle\") pod \"1703fd90-e328-4f67-850f-38f8663dd2c2\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.121269 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4mp\" (UniqueName: \"kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp\") pod \"1703fd90-e328-4f67-850f-38f8663dd2c2\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.121318 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data\") pod \"1703fd90-e328-4f67-850f-38f8663dd2c2\" (UID: \"1703fd90-e328-4f67-850f-38f8663dd2c2\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.134306 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "1703fd90-e328-4f67-850f-38f8663dd2c2" (UID: "1703fd90-e328-4f67-850f-38f8663dd2c2"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.137974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp" (OuterVolumeSpecName: "kube-api-access-qb4mp") pod "1703fd90-e328-4f67-850f-38f8663dd2c2" (UID: "1703fd90-e328-4f67-850f-38f8663dd2c2"). InnerVolumeSpecName "kube-api-access-qb4mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.138111 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data" (OuterVolumeSpecName: "config-data") pod "1703fd90-e328-4f67-850f-38f8663dd2c2" (UID: "1703fd90-e328-4f67-850f-38f8663dd2c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.167171 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1703fd90-e328-4f67-850f-38f8663dd2c2" (UID: "1703fd90-e328-4f67-850f-38f8663dd2c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.226285 4717 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.226319 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.226331 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4mp\" (UniqueName: \"kubernetes.io/projected/1703fd90-e328-4f67-850f-38f8663dd2c2-kube-api-access-qb4mp\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.226341 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1703fd90-e328-4f67-850f-38f8663dd2c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.289822 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.333642 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-fdlxw" event={"ID":"1703fd90-e328-4f67-850f-38f8663dd2c2","Type":"ContainerDied","Data":"d687060f694cb1bbd361cdf0f05684c9ace6fffcb9c7bb5848a39a3af04e19e0"} Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.333690 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d687060f694cb1bbd361cdf0f05684c9ace6fffcb9c7bb5848a39a3af04e19e0" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.333749 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-fdlxw" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.336734 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2","Type":"ContainerStarted","Data":"00612eebc5850b14b1f875f1be3c61defda7f6f42fbf62a33d8e473dd6befbe9"} Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.345089 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-575df7c864-lbv94" event={"ID":"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8","Type":"ContainerDied","Data":"7589b8c423f48553d09ac8eabbdfa2c199e2b9106448b64d81ce03c45429f121"} Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.345134 4717 scope.go:117] "RemoveContainer" containerID="5d3625f2766e914b8f88be4116ee8067b151d5f04dec70cd325693a063cc7704" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.345232 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-575df7c864-lbv94" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.379206 4717 scope.go:117] "RemoveContainer" containerID="740d795e0cb5f6de980dd0eabcf8cf566f0e42a9595d7a1b2b13c7fe407b3e83" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.433910 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle\") pod \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.434265 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqzk4\" (UniqueName: \"kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4\") pod \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.434296 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data\") pod \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.434415 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom\") pod \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.434536 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs\") pod \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\" (UID: \"506e8cb1-00bf-414a-b0c5-1eb6935b0ed8\") " Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.435667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs" (OuterVolumeSpecName: "logs") pod "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" (UID: "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.437694 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" (UID: "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.440029 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4" (OuterVolumeSpecName: "kube-api-access-gqzk4") pod "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" (UID: "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8"). InnerVolumeSpecName "kube-api-access-gqzk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.470535 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" (UID: "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.485450 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data" (OuterVolumeSpecName: "config-data") pod "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" (UID: "506e8cb1-00bf-414a-b0c5-1eb6935b0ed8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.502617 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.5655430470000002 podStartE2EDuration="16.502593053s" podCreationTimestamp="2025-10-07 14:13:18 +0000 UTC" firstStartedPulling="2025-10-07 14:13:19.869725899 +0000 UTC m=+1181.697651701" lastFinishedPulling="2025-10-07 14:13:33.806775915 +0000 UTC m=+1195.634701707" observedRunningTime="2025-10-07 14:13:34.358795462 +0000 UTC m=+1196.186721254" watchObservedRunningTime="2025-10-07 14:13:34.502593053 +0000 UTC m=+1196.330518845" Oct 07 14:13:34 crc kubenswrapper[4717]: W1007 14:13:34.502901 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf44baabf_f1c4_4036_8c6b_ce32cc6cf541.slice/crio-3fd85ec74d21fc11e399d07fdae3df088b14c18158cce575663cb0097b255c33 WatchSource:0}: Error finding container 3fd85ec74d21fc11e399d07fdae3df088b14c18158cce575663cb0097b255c33: Status 404 returned error can't find the container with id 3fd85ec74d21fc11e399d07fdae3df088b14c18158cce575663cb0097b255c33 Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.509846 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-9896cd659-vvdxn"] Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.536974 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.537032 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.537052 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqzk4\" (UniqueName: \"kubernetes.io/projected/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-kube-api-access-gqzk4\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.537065 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.537077 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.679105 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.692862 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-575df7c864-lbv94"] Oct 07 14:13:34 crc kubenswrapper[4717]: I1007 14:13:34.880506 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" path="/var/lib/kubelet/pods/506e8cb1-00bf-414a-b0c5-1eb6935b0ed8/volumes" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.221726 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: E1007 14:13:35.222123 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" containerName="manila-db-sync" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222135 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" containerName="manila-db-sync" Oct 07 14:13:35 crc kubenswrapper[4717]: E1007 14:13:35.222150 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222156 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api" Oct 07 14:13:35 crc kubenswrapper[4717]: E1007 14:13:35.222181 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api-log" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222187 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api-log" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222356 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222367 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="506e8cb1-00bf-414a-b0c5-1eb6935b0ed8" containerName="barbican-api-log" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.222378 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" containerName="manila-db-sync" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.223314 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.226357 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.226676 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-trdg9" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.227997 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.238362 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.239761 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.241528 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.241788 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.271140 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.336836 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355270 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355400 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355453 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355483 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 
14:13:35.355515 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355551 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tns6\" (UniqueName: \"kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355581 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355620 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355647 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355722 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dwzt\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355810 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355839 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.355866 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.358280 4717 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.372637 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.394952 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.397353 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9896cd659-vvdxn" event={"ID":"f44baabf-f1c4-4036-8c6b-ce32cc6cf541","Type":"ContainerStarted","Data":"ef63cfdfb518e8494d9a8c3b1fc6deb383058dc5c5352dd3e7b8450b44b832d0"} Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.397394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9896cd659-vvdxn" event={"ID":"f44baabf-f1c4-4036-8c6b-ce32cc6cf541","Type":"ContainerStarted","Data":"ddaa4247cd329a6617f645c5f20a1ff68011883f719bd5d5eb914a4887827619"} Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.397405 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-9896cd659-vvdxn" event={"ID":"f44baabf-f1c4-4036-8c6b-ce32cc6cf541","Type":"ContainerStarted","Data":"3fd85ec74d21fc11e399d07fdae3df088b14c18158cce575663cb0097b255c33"} Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.397439 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.397540 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.434656 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-9896cd659-vvdxn" podStartSLOduration=6.434618065 podStartE2EDuration="6.434618065s" podCreationTimestamp="2025-10-07 14:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:35.426970335 +0000 UTC m=+1197.254896127" watchObservedRunningTime="2025-10-07 14:13:35.434618065 +0000 UTC m=+1197.262543857" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459543 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459619 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459656 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459812 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459921 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tns6\" (UniqueName: \"kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.459958 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.460448 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 
14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.460481 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.460510 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.460564 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.460646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dwzt\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.461325 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.461410 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.461858 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgmqz\" (UniqueName: \"kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.461896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.461966 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 
14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.464175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.472459 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.475858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.476694 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.476736 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.477232 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.477618 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.479434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.485742 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.492103 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tns6\" (UniqueName: \"kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6\") pod \"manila-scheduler-0\" (UID: 
\"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.500213 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.506337 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dwzt\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt\") pod \"manila-share-share1-0\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.532838 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.534375 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.539715 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.545640 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.552711 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.563930 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.564022 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgmqz\" (UniqueName: \"kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.564044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.564059 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.564130 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " 
pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.564201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.565109 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.565625 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.566414 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.566940 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.567176 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.576486 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.592992 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgmqz\" (UniqueName: \"kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz\") pod \"dnsmasq-dns-5865f9d689-6k48l\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666251 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666315 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fscm7\" (UniqueName: \"kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666399 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.666488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.696739 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774399 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774438 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fscm7\" (UniqueName: \"kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774558 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774591 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774699 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.774919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.783021 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " 
pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.784084 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.786756 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.788134 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.799787 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fscm7\" (UniqueName: \"kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7\") pod \"manila-api-0\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " pod="openstack/manila-api-0" Oct 07 14:13:35 crc kubenswrapper[4717]: I1007 14:13:35.913758 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.211176 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.211962 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" containerName="kube-state-metrics" containerID="cri-o://0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb" gracePeriod=30 Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.227907 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.257536 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.316446 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:13:36 crc kubenswrapper[4717]: E1007 14:13:36.373310 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bea9493_f1bb_4bce_8d15_f18fc71b3df1.slice/crio-conmon-0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bea9493_f1bb_4bce_8d15_f18fc71b3df1.slice/crio-0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.432784 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerStarted","Data":"1556651da3dbe2c23a1fcd97a5bd25122b6f4de856f5153b70b13ffabdbe92eb"} Oct 07 
14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.434121 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" event={"ID":"54988e9a-d6e5-46da-b117-ed0dc218d42c","Type":"ContainerStarted","Data":"ea25dd4a787b903b4037d57b089a89cec974d843cf09d4ca5b681860b07eeded"} Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.458227 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerStarted","Data":"e497cae101c38bb14810ce9c1280819e18e896b653f30c592a6c8ef11552005d"} Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.466364 4717 generic.go:334] "Generic (PLEG): container finished" podID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" containerID="0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb" exitCode=2 Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.469422 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bea9493-f1bb-4bce-8d15-f18fc71b3df1","Type":"ContainerDied","Data":"0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb"} Oct 07 14:13:36 crc kubenswrapper[4717]: I1007 14:13:36.588712 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.005716 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.114903 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfw6x\" (UniqueName: \"kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x\") pod \"1bea9493-f1bb-4bce-8d15-f18fc71b3df1\" (UID: \"1bea9493-f1bb-4bce-8d15-f18fc71b3df1\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.122053 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x" (OuterVolumeSpecName: "kube-api-access-qfw6x") pod "1bea9493-f1bb-4bce-8d15-f18fc71b3df1" (UID: "1bea9493-f1bb-4bce-8d15-f18fc71b3df1"). InnerVolumeSpecName "kube-api-access-qfw6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.220381 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfw6x\" (UniqueName: \"kubernetes.io/projected/1bea9493-f1bb-4bce-8d15-f18fc71b3df1-kube-api-access-qfw6x\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.333125 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.430743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431105 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431161 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431194 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431231 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431363 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.431425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbqwq\" (UniqueName: \"kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq\") pod \"4970ff22-3916-470e-aa5e-d2464fd0f905\" (UID: \"4970ff22-3916-470e-aa5e-d2464fd0f905\") " Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.432829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.433585 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.435708 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts" (OuterVolumeSpecName: "scripts") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.435824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq" (OuterVolumeSpecName: "kube-api-access-xbqwq") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "kube-api-access-xbqwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.468374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.486469 4717 generic.go:334] "Generic (PLEG): container finished" podID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerID="3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58" exitCode=0 Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.486539 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerDied","Data":"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.486569 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4970ff22-3916-470e-aa5e-d2464fd0f905","Type":"ContainerDied","Data":"c82b637562d5ff50cedab51c4c4b736b0654ea6b5b5abccf6794f8a6ff3a4731"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.486588 4717 scope.go:117] "RemoveContainer" containerID="54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.486725 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.490380 4717 generic.go:334] "Generic (PLEG): container finished" podID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerID="e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49" exitCode=0 Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.490491 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" event={"ID":"54988e9a-d6e5-46da-b117-ed0dc218d42c","Type":"ContainerDied","Data":"e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.499932 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.500340 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bea9493-f1bb-4bce-8d15-f18fc71b3df1","Type":"ContainerDied","Data":"a73447455ba02d450fcb9261323de3d993de56b49a2c2e61ff7aecb78c179201"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.504251 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerStarted","Data":"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.504297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerStarted","Data":"3ff5a3268ef868b45891c6cc5e0ec90faa9a2c231a53ddc50b6d6025c5a1d882"} Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.534026 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.534061 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbqwq\" (UniqueName: \"kubernetes.io/projected/4970ff22-3916-470e-aa5e-d2464fd0f905-kube-api-access-xbqwq\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.534073 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.534084 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.534095 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4970ff22-3916-470e-aa5e-d2464fd0f905-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.582165 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.582256 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.614239 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.626429 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:37 crc kubenswrapper[4717]: E1007 14:13:37.626964 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-central-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.626983 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-central-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: E1007 14:13:37.627051 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="sg-core" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627062 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="sg-core" Oct 07 14:13:37 crc kubenswrapper[4717]: E1007 14:13:37.627082 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" containerName="kube-state-metrics" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627091 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" containerName="kube-state-metrics" Oct 07 14:13:37 crc kubenswrapper[4717]: E1007 14:13:37.627110 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="proxy-httpd" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627118 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="proxy-httpd" Oct 07 14:13:37 crc kubenswrapper[4717]: E1007 14:13:37.627130 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-notification-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627138 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-notification-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627380 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-notification-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627395 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="sg-core" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627410 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" containerName="ceilometer-central-agent" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627423 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" containerName="kube-state-metrics" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.627438 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" 
containerName="proxy-httpd" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.628259 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.630729 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.632150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data" (OuterVolumeSpecName: "config-data") pod "4970ff22-3916-470e-aa5e-d2464fd0f905" (UID: "4970ff22-3916-470e-aa5e-d2464fd0f905"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.632337 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.635827 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.635865 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4970ff22-3916-470e-aa5e-d2464fd0f905-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.641596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.648082 4717 scope.go:117] "RemoveContainer" containerID="a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.703448 4717 scope.go:117] "RemoveContainer" containerID="3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.737086 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.737278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.737323 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4v79\" (UniqueName: \"kubernetes.io/projected/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-api-access-c4v79\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.737360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-config\") pod 
\"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.788273 4717 scope.go:117] "RemoveContainer" containerID="0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.839396 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.839873 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4v79\" (UniqueName: \"kubernetes.io/projected/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-api-access-c4v79\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.839971 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.840099 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.844710 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.846757 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.854484 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bcc7eb6-640f-4300-9943-2ba004773b3b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.858847 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4v79\" (UniqueName: \"kubernetes.io/projected/9bcc7eb6-640f-4300-9943-2ba004773b3b-kube-api-access-c4v79\") pod \"kube-state-metrics-0\" (UID: \"9bcc7eb6-640f-4300-9943-2ba004773b3b\") " pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.961077 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 07 14:13:37 crc kubenswrapper[4717]: I1007 14:13:37.992458 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.017167 4717 scope.go:117] "RemoveContainer" containerID="54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7" Oct 07 14:13:38 crc kubenswrapper[4717]: E1007 14:13:38.021349 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7\": container with ID starting with 54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7 not found: ID does not exist" containerID="54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.021395 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7"} err="failed to get container status \"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7\": rpc error: code = NotFound desc = could not find container \"54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7\": container with ID starting with 54ed3b43267446b3d838af2b9a29fce7f5224cd46387beae3ccf9d0190551bd7 not found: ID does not exist" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.021420 4717 scope.go:117] "RemoveContainer" containerID="a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.023891 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: E1007 14:13:38.026297 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567\": container with ID starting with a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567 not found: ID does not exist" containerID="a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.026352 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567"} err="failed to get container status \"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567\": rpc error: code = NotFound desc = could not find container \"a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567\": container with ID starting with a7ba81309dab82b512e7a924b275896ef4b8f02abb92fa9a6bc15829b6fbe567 not found: ID does not exist" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.026385 4717 scope.go:117] "RemoveContainer" containerID="3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58" Oct 07 14:13:38 crc kubenswrapper[4717]: E1007 14:13:38.027125 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58\": container with ID starting with 3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58 not found: ID does not exist" containerID="3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.027154 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58"} err="failed to get container status \"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58\": rpc error: code = NotFound desc = could not find container \"3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58\": container with ID starting with 3e697248a63b68ee4f95c5508b234c4af5a33779590fe33f8cc64efbf21a1b58 not found: ID does not exist" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.027172 4717 scope.go:117] "RemoveContainer" containerID="0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19" Oct 07 14:13:38 crc kubenswrapper[4717]: E1007 14:13:38.027418 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19\": container with ID starting with 0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19 not found: ID does not exist" containerID="0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.027449 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19"} err="failed to get container status \"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19\": rpc error: code = NotFound desc = could not find container \"0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19\": container with ID starting with 0a64bbe6993714921420b3630a6738f55556e90cac5c68e0ec4e5cc475b31b19 not found: ID does not exist" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.027467 4717 scope.go:117] "RemoveContainer" containerID="0f2d92a701650746410e1d309e2fc45a1b827ddb259fe6416a7b240e21878aeb" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.040116 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.042832 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.045465 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.045667 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.048784 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.146521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.146795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.146883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.146908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.147585 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd9gk\" (UniqueName: \"kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.147693 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.147802 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250655 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 
14:13:38.250699 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250793 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250848 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd9gk\" (UniqueName: \"kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250877 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.250907 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.251384 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.251404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.256511 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.257472 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.258757 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.259493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.279883 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd9gk\" (UniqueName: \"kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk\") pod \"ceilometer-0\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.376862 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.531994 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.537019 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" event={"ID":"54988e9a-d6e5-46da-b117-ed0dc218d42c","Type":"ContainerStarted","Data":"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd"} Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.537675 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.543797 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerStarted","Data":"a118246ca00080d5080df0ce6cdabe2cf683953d0dc337fcc159ab1e4e95d4c6"} Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.559999 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" podStartSLOduration=3.559977423 podStartE2EDuration="3.559977423s" podCreationTimestamp="2025-10-07 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:38.556321963 +0000 UTC m=+1200.384247755" watchObservedRunningTime="2025-10-07 14:13:38.559977423 +0000 UTC m=+1200.387903215" Oct 07 14:13:38 crc kubenswrapper[4717]: W1007 14:13:38.569271 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bcc7eb6_640f_4300_9943_2ba004773b3b.slice/crio-a6f797c10247f31f19a7b26adededd60865943c0fbcc13bde6102ee004478d2f WatchSource:0}: Error finding container a6f797c10247f31f19a7b26adededd60865943c0fbcc13bde6102ee004478d2f: Status 404 returned error can't find the container with id a6f797c10247f31f19a7b26adededd60865943c0fbcc13bde6102ee004478d2f Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.574177 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerStarted","Data":"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7"} Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.575467 4717 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.615155 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.615135165 podStartE2EDuration="3.615135165s" podCreationTimestamp="2025-10-07 14:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:38.602492918 +0000 UTC m=+1200.430418700" watchObservedRunningTime="2025-10-07 14:13:38.615135165 +0000 UTC m=+1200.443060957" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.900745 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bea9493-f1bb-4bce-8d15-f18fc71b3df1" path="/var/lib/kubelet/pods/1bea9493-f1bb-4bce-8d15-f18fc71b3df1/volumes" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.901774 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4970ff22-3916-470e-aa5e-d2464fd0f905" path="/var/lib/kubelet/pods/4970ff22-3916-470e-aa5e-d2464fd0f905/volumes" Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.958702 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:38 crc kubenswrapper[4717]: I1007 14:13:38.991530 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.227490 4717 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcb6f9762-a7d3-48ee-97ce-57439f4ee323"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcb6f9762-a7d3-48ee-97ce-57439f4ee323] : Timed out while waiting for systemd to remove kubepods-besteffort-podcb6f9762_a7d3_48ee_97ce_57439f4ee323.slice" Oct 07 14:13:39 crc kubenswrapper[4717]: E1007 14:13:39.227543 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podcb6f9762-a7d3-48ee-97ce-57439f4ee323] : unable to destroy cgroup paths for cgroup [kubepods besteffort podcb6f9762-a7d3-48ee-97ce-57439f4ee323] : Timed out while waiting for systemd to remove kubepods-besteffort-podcb6f9762_a7d3_48ee_97ce_57439f4ee323.slice" pod="openstack/neutron-db-sync-8w5dj" podUID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.305755 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.432146 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.643650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerStarted","Data":"a39b811b7d965f57ff59c8570d76d6dc6ecf7b46f7e7a7f8317183554424430c"} Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.662659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9bcc7eb6-640f-4300-9943-2ba004773b3b","Type":"ContainerStarted","Data":"9f0cec8159d2fc1fcd62a6878f898d8db073e7bcd2110b8be740db3ee10fe254"} Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.662710 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"9bcc7eb6-640f-4300-9943-2ba004773b3b","Type":"ContainerStarted","Data":"a6f797c10247f31f19a7b26adededd60865943c0fbcc13bde6102ee004478d2f"} Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.663981 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.675405 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8w5dj" Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.677063 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerStarted","Data":"3d04e1bbbd495dfc2b9b5c228bde2d5852a718544d4f656b78477b9512c2e905"} Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.699998 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.251507675 podStartE2EDuration="2.699980185s" podCreationTimestamp="2025-10-07 14:13:37 +0000 UTC" firstStartedPulling="2025-10-07 14:13:38.576928138 +0000 UTC m=+1200.404853930" lastFinishedPulling="2025-10-07 14:13:39.025400648 +0000 UTC m=+1200.853326440" observedRunningTime="2025-10-07 14:13:39.698533705 +0000 UTC m=+1201.526459497" watchObservedRunningTime="2025-10-07 14:13:39.699980185 +0000 UTC m=+1201.527905977" Oct 07 14:13:39 crc kubenswrapper[4717]: I1007 14:13:39.741284 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.207264577 podStartE2EDuration="4.741265176s" podCreationTimestamp="2025-10-07 14:13:35 +0000 UTC" firstStartedPulling="2025-10-07 14:13:36.259623774 +0000 UTC m=+1198.087549566" lastFinishedPulling="2025-10-07 14:13:37.793624373 +0000 UTC m=+1199.621550165" observedRunningTime="2025-10-07 14:13:39.736234038 +0000 UTC m=+1201.564159830" watchObservedRunningTime="2025-10-07 14:13:39.741265176 +0000 UTC m=+1201.569190968" Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.403314 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.404474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-9896cd659-vvdxn" Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.586592 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.586825 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-log" containerID="cri-o://0f04054b9015786809c1f294f2e557e2875402fdf4d407d2b423c550565c207a" gracePeriod=30 Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.586964 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-httpd" containerID="cri-o://1f7b6873101b80934ebcd8b04468facfc3035d0249009cd29f3246b3808ae2ab" gracePeriod=30 Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.693130 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerStarted","Data":"34eb74cecb9e9dc300b8ec1c675d73644144336140dec5974a7e374f30691918"} Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.693699 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerStarted","Data":"90d3cb8ee609f9347cf9ac6f2259f3dcbb6dec78fdf72356936980ca6fb5fd71"} Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.695374 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api-log" containerID="cri-o://55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" gracePeriod=30 Oct 07 14:13:40 crc kubenswrapper[4717]: I1007 14:13:40.695807 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api" containerID="cri-o://23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" gracePeriod=30 Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.397023 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zlz22"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.398580 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.421883 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zlz22"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.440619 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.484753 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bgz59"] Oct 07 14:13:41 crc kubenswrapper[4717]: E1007 14:13:41.485349 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api-log" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.485365 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api-log" Oct 07 14:13:41 crc kubenswrapper[4717]: E1007 14:13:41.485395 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.485420 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.485744 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api-log" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.485779 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerName="manila-api" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.486822 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.532361 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bgz59"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.573501 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jrb7v"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.574839 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.578696 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.578838 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.578885 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.578933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.579022 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.579059 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.579093 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fscm7\" (UniqueName: \"kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7\") pod \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\" (UID: \"249be3b9-466c-4902-b8b2-ff2ef54dd8b1\") " Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.579333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkd6h\" (UniqueName: \"kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h\") pod \"nova-api-db-create-zlz22\" (UID: \"f29c601f-2095-4f1f-baf8-8c79792118dd\") " pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.579512 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p768z\" (UniqueName: \"kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z\") pod \"nova-cell0-db-create-bgz59\" (UID: \"3576d729-52a6-41c4-b070-64f86f9bc55b\") " pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.580076 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.580389 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs" (OuterVolumeSpecName: "logs") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.583698 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jrb7v"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.584302 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7" (OuterVolumeSpecName: "kube-api-access-fscm7") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "kube-api-access-fscm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.585885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts" (OuterVolumeSpecName: "scripts") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.586883 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.613169 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.650283 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data" (OuterVolumeSpecName: "config-data") pod "249be3b9-466c-4902-b8b2-ff2ef54dd8b1" (UID: "249be3b9-466c-4902-b8b2-ff2ef54dd8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681636 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p768z\" (UniqueName: \"kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z\") pod \"nova-cell0-db-create-bgz59\" (UID: \"3576d729-52a6-41c4-b070-64f86f9bc55b\") " pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681684 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69n5\" (UniqueName: \"kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5\") pod \"nova-cell1-db-create-jrb7v\" (UID: \"0061a317-01ad-4690-a87d-8e2e6f6f3344\") " pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681721 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkd6h\" (UniqueName: \"kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h\") pod \"nova-api-db-create-zlz22\" (UID: \"f29c601f-2095-4f1f-baf8-8c79792118dd\") " pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681898 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681918 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681929 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681940 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fscm7\" (UniqueName: \"kubernetes.io/projected/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-kube-api-access-fscm7\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681953 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681962 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.681972 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/249be3b9-466c-4902-b8b2-ff2ef54dd8b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.699412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkd6h\" (UniqueName: \"kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h\") pod \"nova-api-db-create-zlz22\" (UID: \"f29c601f-2095-4f1f-baf8-8c79792118dd\") " pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 
14:13:41.700897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p768z\" (UniqueName: \"kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z\") pod \"nova-cell0-db-create-bgz59\" (UID: \"3576d729-52a6-41c4-b070-64f86f9bc55b\") " pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724665 4717 generic.go:334] "Generic (PLEG): container finished" podID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerID="23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" exitCode=0 Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724711 4717 generic.go:334] "Generic (PLEG): container finished" podID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" containerID="55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" exitCode=143 Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724752 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerDied","Data":"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7"} Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724778 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerDied","Data":"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a"} Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724790 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"249be3b9-466c-4902-b8b2-ff2ef54dd8b1","Type":"ContainerDied","Data":"3ff5a3268ef868b45891c6cc5e0ec90faa9a2c231a53ddc50b6d6025c5a1d882"} Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724804 4717 scope.go:117] "RemoveContainer" containerID="23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.724936 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.749132 4717 generic.go:334] "Generic (PLEG): container finished" podID="1127e471-e2a5-436b-8433-c56124359062" containerID="0f04054b9015786809c1f294f2e557e2875402fdf4d407d2b423c550565c207a" exitCode=143 Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.749231 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerDied","Data":"0f04054b9015786809c1f294f2e557e2875402fdf4d407d2b423c550565c207a"} Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.766313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerStarted","Data":"751cce8eab3a1d053417b258226f80ca98f6dbefe6e3a990dd5c53f57787a16a"} Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.770474 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.783877 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69n5\" (UniqueName: \"kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5\") pod \"nova-cell1-db-create-jrb7v\" (UID: \"0061a317-01ad-4690-a87d-8e2e6f6f3344\") " pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.803476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69n5\" (UniqueName: \"kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5\") pod \"nova-cell1-db-create-jrb7v\" (UID: \"0061a317-01ad-4690-a87d-8e2e6f6f3344\") " pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.804367 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.823430 4717 scope.go:117] "RemoveContainer" containerID="55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.823461 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.834275 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.844116 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.846252 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.849977 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.850937 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.851082 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.857797 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.895286 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.987723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.987783 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.987813 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p75dt\" (UniqueName: \"kubernetes.io/projected/3351608d-f560-40bb-a1f3-c8711a80a7a4-kube-api-access-p75dt\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988415 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3351608d-f560-40bb-a1f3-c8711a80a7a4-etc-machine-id\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3351608d-f560-40bb-a1f3-c8711a80a7a4-logs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988785 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-scripts\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988841 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988936 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data-custom\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:41 crc kubenswrapper[4717]: I1007 14:13:41.988975 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-public-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.080239 4717 scope.go:117] "RemoveContainer" 
containerID="23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" Oct 07 14:13:42 crc kubenswrapper[4717]: E1007 14:13:42.083964 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7\": container with ID starting with 23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7 not found: ID does not exist" containerID="23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.084037 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7"} err="failed to get container status \"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7\": rpc error: code = NotFound desc = could not find container \"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7\": container with ID starting with 23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7 not found: ID does not exist" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.084068 4717 scope.go:117] "RemoveContainer" containerID="55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" Oct 07 14:13:42 crc kubenswrapper[4717]: E1007 14:13:42.084582 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a\": container with ID starting with 55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a not found: ID does not exist" containerID="55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.084620 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a"} err="failed to get container status \"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a\": rpc error: code = NotFound desc = could not find container \"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a\": container with ID starting with 55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a not found: ID does not exist" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.084639 4717 scope.go:117] "RemoveContainer" containerID="23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.085268 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7"} err="failed to get container status \"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7\": rpc error: code = NotFound desc = could not find container \"23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7\": container with ID starting with 23b7932272ace66b093e1680b917730cdd8723a8a104aa71d4f35f696ed5f7c7 not found: ID does not exist" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.085291 4717 scope.go:117] "RemoveContainer" containerID="55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.085702 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a"} err="failed to get container status \"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a\": rpc error: code = NotFound desc = could not find container \"55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a\": container with ID starting with 55b238e52e8d6ba4dd738df27e038668adc66b644b06c8e5dfc1cb42c5d0cf7a not found: ID does not exist" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.093441 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.093503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.093528 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p75dt\" (UniqueName: \"kubernetes.io/projected/3351608d-f560-40bb-a1f3-c8711a80a7a4-kube-api-access-p75dt\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.093588 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3351608d-f560-40bb-a1f3-c8711a80a7a4-etc-machine-id\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.095250 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3351608d-f560-40bb-a1f3-c8711a80a7a4-etc-machine-id\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.095657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3351608d-f560-40bb-a1f3-c8711a80a7a4-logs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.095757 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-scripts\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.095922 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.096038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data-custom\") pod 
\"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.096076 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-public-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.106604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-scripts\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.107487 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3351608d-f560-40bb-a1f3-c8711a80a7a4-logs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.108417 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-internal-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.120209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.126393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data-custom\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.127110 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p75dt\" (UniqueName: \"kubernetes.io/projected/3351608d-f560-40bb-a1f3-c8711a80a7a4-kube-api-access-p75dt\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.138130 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-config-data\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.138702 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3351608d-f560-40bb-a1f3-c8711a80a7a4-public-tls-certs\") pod \"manila-api-0\" (UID: \"3351608d-f560-40bb-a1f3-c8711a80a7a4\") " pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.314990 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.407265 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zlz22"] Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.564737 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bgz59"] Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.600059 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jrb7v"] Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.717639 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-59986d7f85-m2phf" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.785602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bgz59" event={"ID":"3576d729-52a6-41c4-b070-64f86f9bc55b","Type":"ContainerStarted","Data":"1be0b2eb5adb177178451b9dc23e5efc99a2b1d9ea19b48442d217f3b5fc748a"} Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.799561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zlz22" event={"ID":"f29c601f-2095-4f1f-baf8-8c79792118dd","Type":"ContainerStarted","Data":"0e78793318db11adec9b8d99ee0cf901588375e520aa6f4d374137c06585184c"} Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.799610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zlz22" event={"ID":"f29c601f-2095-4f1f-baf8-8c79792118dd","Type":"ContainerStarted","Data":"325c6d38bdde30c656896cc356415d53027a1736c0506b71bc5bcfdd01dff3ad"} Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.802369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jrb7v" event={"ID":"0061a317-01ad-4690-a87d-8e2e6f6f3344","Type":"ContainerStarted","Data":"a2731cea081607ae33e8a9a72ed4ff450844b9cf53b0a28abe7d249f20cd8325"} Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.803321 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.803594 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b594ccc88-lfbdk" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-api" containerID="cri-o://d83a1c1d4c5ccc4c5f0d50a08e3c5f8730287a5b4c216cef68a918d5fcb2e5b1" gracePeriod=30 Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.803757 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5b594ccc88-lfbdk" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-httpd" containerID="cri-o://3bf00faee54a5d7ebf3ec00ac8a8e039f45668051403d10301659fac663a7a4b" gracePeriod=30 Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.831965 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-zlz22" podStartSLOduration=1.8319465639999999 podStartE2EDuration="1.831946564s" podCreationTimestamp="2025-10-07 14:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:42.826974778 +0000 UTC m=+1204.654900570" watchObservedRunningTime="2025-10-07 14:13:42.831946564 +0000 UTC m=+1204.659872356" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.883744 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="249be3b9-466c-4902-b8b2-ff2ef54dd8b1" path="/var/lib/kubelet/pods/249be3b9-466c-4902-b8b2-ff2ef54dd8b1/volumes" Oct 07 14:13:42 crc kubenswrapper[4717]: I1007 14:13:42.912419 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 07 14:13:42 crc kubenswrapper[4717]: W1007 14:13:42.933237 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3351608d_f560_40bb_a1f3_c8711a80a7a4.slice/crio-c926e197482f9e24c980030ba886ece3ef3d759f313d9932d93679a681cb6f5a WatchSource:0}: Error finding container c926e197482f9e24c980030ba886ece3ef3d759f313d9932d93679a681cb6f5a: Status 404 returned error can't find the container with id c926e197482f9e24c980030ba886ece3ef3d759f313d9932d93679a681cb6f5a Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.824315 4717 generic.go:334] "Generic (PLEG): container finished" podID="0061a317-01ad-4690-a87d-8e2e6f6f3344" containerID="6ae81542ac151e06e3eb9cf5a33e8094b274215f7751d1bd9a51277ec1b63d01" exitCode=0 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.824771 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jrb7v" event={"ID":"0061a317-01ad-4690-a87d-8e2e6f6f3344","Type":"ContainerDied","Data":"6ae81542ac151e06e3eb9cf5a33e8094b274215f7751d1bd9a51277ec1b63d01"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.827680 4717 generic.go:334] "Generic (PLEG): container finished" podID="3576d729-52a6-41c4-b070-64f86f9bc55b" containerID="504e95263a0df8a967d21c8e96b7da41e13d637e5d96669d2d2d78062818cc77" exitCode=0 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.827748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bgz59" event={"ID":"3576d729-52a6-41c4-b070-64f86f9bc55b","Type":"ContainerDied","Data":"504e95263a0df8a967d21c8e96b7da41e13d637e5d96669d2d2d78062818cc77"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.832731 4717 generic.go:334] "Generic (PLEG): container finished" podID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerID="25abb5e5e2cc6973a5c847f87a3010cf2ce44a9d1857ea16695ba5d21d0fb59e" exitCode=1 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.832829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerDied","Data":"25abb5e5e2cc6973a5c847f87a3010cf2ce44a9d1857ea16695ba5d21d0fb59e"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.832989 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-central-agent" containerID="cri-o://90d3cb8ee609f9347cf9ac6f2259f3dcbb6dec78fdf72356936980ca6fb5fd71" gracePeriod=30 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.833326 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="sg-core" containerID="cri-o://751cce8eab3a1d053417b258226f80ca98f6dbefe6e3a990dd5c53f57787a16a" gracePeriod=30 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.833391 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-notification-agent" containerID="cri-o://34eb74cecb9e9dc300b8ec1c675d73644144336140dec5974a7e374f30691918" gracePeriod=30 Oct 
07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.855459 4717 generic.go:334] "Generic (PLEG): container finished" podID="278a49b0-beb4-430b-ad36-065e5911394b" containerID="3bf00faee54a5d7ebf3ec00ac8a8e039f45668051403d10301659fac663a7a4b" exitCode=0 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.855540 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerDied","Data":"3bf00faee54a5d7ebf3ec00ac8a8e039f45668051403d10301659fac663a7a4b"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.868356 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zlz22" event={"ID":"f29c601f-2095-4f1f-baf8-8c79792118dd","Type":"ContainerDied","Data":"0e78793318db11adec9b8d99ee0cf901588375e520aa6f4d374137c06585184c"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.868358 4717 generic.go:334] "Generic (PLEG): container finished" podID="f29c601f-2095-4f1f-baf8-8c79792118dd" containerID="0e78793318db11adec9b8d99ee0cf901588375e520aa6f4d374137c06585184c" exitCode=0 Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.891965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3351608d-f560-40bb-a1f3-c8711a80a7a4","Type":"ContainerStarted","Data":"33c51844d600eb5a0e21d2d21179089d9e11a298abdb6f2caacf0026737b6d9f"} Oct 07 14:13:43 crc kubenswrapper[4717]: I1007 14:13:43.892272 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3351608d-f560-40bb-a1f3-c8711a80a7a4","Type":"ContainerStarted","Data":"c926e197482f9e24c980030ba886ece3ef3d759f313d9932d93679a681cb6f5a"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.915824 4717 generic.go:334] "Generic (PLEG): container finished" podID="278a49b0-beb4-430b-ad36-065e5911394b" containerID="d83a1c1d4c5ccc4c5f0d50a08e3c5f8730287a5b4c216cef68a918d5fcb2e5b1" exitCode=0 Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.916057 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerDied","Data":"d83a1c1d4c5ccc4c5f0d50a08e3c5f8730287a5b4c216cef68a918d5fcb2e5b1"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.923748 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"3351608d-f560-40bb-a1f3-c8711a80a7a4","Type":"ContainerStarted","Data":"ea73e947bf6218e587e3961827ae203ac03335839ccafc7790b15824c792dbc7"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.923918 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.934055 4717 generic.go:334] "Generic (PLEG): container finished" podID="1127e471-e2a5-436b-8433-c56124359062" containerID="1f7b6873101b80934ebcd8b04468facfc3035d0249009cd29f3246b3808ae2ab" exitCode=0 Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.934169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerDied","Data":"1f7b6873101b80934ebcd8b04468facfc3035d0249009cd29f3246b3808ae2ab"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.947206 4717 generic.go:334] "Generic (PLEG): container finished" podID="99a3cf21-f136-426d-a894-226ba3c2d20b" 
containerID="751cce8eab3a1d053417b258226f80ca98f6dbefe6e3a990dd5c53f57787a16a" exitCode=2 Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.947245 4717 generic.go:334] "Generic (PLEG): container finished" podID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerID="34eb74cecb9e9dc300b8ec1c675d73644144336140dec5974a7e374f30691918" exitCode=0 Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.947378 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerDied","Data":"751cce8eab3a1d053417b258226f80ca98f6dbefe6e3a990dd5c53f57787a16a"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.947414 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerDied","Data":"34eb74cecb9e9dc300b8ec1c675d73644144336140dec5974a7e374f30691918"} Oct 07 14:13:44 crc kubenswrapper[4717]: I1007 14:13:44.952288 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.9522508800000002 podStartE2EDuration="3.95225088s" podCreationTimestamp="2025-10-07 14:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:44.941129676 +0000 UTC m=+1206.769055468" watchObservedRunningTime="2025-10-07 14:13:44.95225088 +0000 UTC m=+1206.780176672" Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.546532 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.693074 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.693362 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-log" containerID="cri-o://8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1" gracePeriod=30 Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.693456 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-httpd" containerID="cri-o://7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08" gracePeriod=30 Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.700560 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.765141 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.765436 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="dnsmasq-dns" containerID="cri-o://0c7560b6c5c12fb1a17cabe6cf83cdb6a91e53a8e5675120e7d6f638ba5cecd3" gracePeriod=10 Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.968996 4717 generic.go:334] "Generic (PLEG): container finished" podID="26678e24-2754-454e-abdc-3add4caf4c81" containerID="0c7560b6c5c12fb1a17cabe6cf83cdb6a91e53a8e5675120e7d6f638ba5cecd3" exitCode=0 Oct 07 14:13:45 crc 
kubenswrapper[4717]: I1007 14:13:45.969113 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" event={"ID":"26678e24-2754-454e-abdc-3add4caf4c81","Type":"ContainerDied","Data":"0c7560b6c5c12fb1a17cabe6cf83cdb6a91e53a8e5675120e7d6f638ba5cecd3"} Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.972934 4717 generic.go:334] "Generic (PLEG): container finished" podID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerID="90d3cb8ee609f9347cf9ac6f2259f3dcbb6dec78fdf72356936980ca6fb5fd71" exitCode=0 Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.973041 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerDied","Data":"90d3cb8ee609f9347cf9ac6f2259f3dcbb6dec78fdf72356936980ca6fb5fd71"} Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.977573 4717 generic.go:334] "Generic (PLEG): container finished" podID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerID="8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1" exitCode=143 Oct 07 14:13:45 crc kubenswrapper[4717]: I1007 14:13:45.977835 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerDied","Data":"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1"} Oct 07 14:13:47 crc kubenswrapper[4717]: I1007 14:13:47.981277 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 07 14:13:47 crc kubenswrapper[4717]: I1007 14:13:47.998991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jrb7v" event={"ID":"0061a317-01ad-4690-a87d-8e2e6f6f3344","Type":"ContainerDied","Data":"a2731cea081607ae33e8a9a72ed4ff450844b9cf53b0a28abe7d249f20cd8325"} Oct 07 14:13:47 crc kubenswrapper[4717]: I1007 14:13:47.999048 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2731cea081607ae33e8a9a72ed4ff450844b9cf53b0a28abe7d249f20cd8325" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.000401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zlz22" event={"ID":"f29c601f-2095-4f1f-baf8-8c79792118dd","Type":"ContainerDied","Data":"325c6d38bdde30c656896cc356415d53027a1736c0506b71bc5bcfdd01dff3ad"} Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.000423 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="325c6d38bdde30c656896cc356415d53027a1736c0506b71bc5bcfdd01dff3ad" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.134802 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.172157 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.176771 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.278359 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281545 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g69n5\" (UniqueName: \"kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5\") pod \"0061a317-01ad-4690-a87d-8e2e6f6f3344\" (UID: \"0061a317-01ad-4690-a87d-8e2e6f6f3344\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281598 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281616 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkd6h\" (UniqueName: \"kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h\") pod \"f29c601f-2095-4f1f-baf8-8c79792118dd\" (UID: \"f29c601f-2095-4f1f-baf8-8c79792118dd\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281644 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281710 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281736 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281764 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281818 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281850 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpwxj\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: 
\"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.281985 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs\") pod \"1127e471-e2a5-436b-8433-c56124359062\" (UID: \"1127e471-e2a5-436b-8433-c56124359062\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.287131 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.287442 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h" (OuterVolumeSpecName: "kube-api-access-fkd6h") pod "f29c601f-2095-4f1f-baf8-8c79792118dd" (UID: "f29c601f-2095-4f1f-baf8-8c79792118dd"). InnerVolumeSpecName "kube-api-access-fkd6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.287774 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs" (OuterVolumeSpecName: "logs") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.288260 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5" (OuterVolumeSpecName: "kube-api-access-g69n5") pod "0061a317-01ad-4690-a87d-8e2e6f6f3344" (UID: "0061a317-01ad-4690-a87d-8e2e6f6f3344"). InnerVolumeSpecName "kube-api-access-g69n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.289582 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts" (OuterVolumeSpecName: "scripts") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.293359 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph" (OuterVolumeSpecName: "ceph") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.294838 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj" (OuterVolumeSpecName: "kube-api-access-qpwxj") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "kube-api-access-qpwxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.304991 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.308850 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.330109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.369550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383452 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p768z\" (UniqueName: \"kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z\") pod \"3576d729-52a6-41c4-b070-64f86f9bc55b\" (UID: \"3576d729-52a6-41c4-b070-64f86f9bc55b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383922 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383948 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383961 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383974 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.383987 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpwxj\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-kube-api-access-qpwxj\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.384000 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1127e471-e2a5-436b-8433-c56124359062-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 
14:13:48.384030 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g69n5\" (UniqueName: \"kubernetes.io/projected/0061a317-01ad-4690-a87d-8e2e6f6f3344-kube-api-access-g69n5\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.384059 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.384074 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkd6h\" (UniqueName: \"kubernetes.io/projected/f29c601f-2095-4f1f-baf8-8c79792118dd-kube-api-access-fkd6h\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.384085 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1127e471-e2a5-436b-8433-c56124359062-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.396976 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z" (OuterVolumeSpecName: "kube-api-access-p768z") pod "3576d729-52a6-41c4-b070-64f86f9bc55b" (UID: "3576d729-52a6-41c4-b070-64f86f9bc55b"). InnerVolumeSpecName "kube-api-access-p768z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.397890 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data" (OuterVolumeSpecName: "config-data") pod "1127e471-e2a5-436b-8433-c56124359062" (UID: "1127e471-e2a5-436b-8433-c56124359062"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.406742 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.472333 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485383 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485425 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd9gk\" (UniqueName: \"kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485446 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485537 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485571 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485602 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.485719 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle\") pod \"99a3cf21-f136-426d-a894-226ba3c2d20b\" (UID: \"99a3cf21-f136-426d-a894-226ba3c2d20b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.486282 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p768z\" (UniqueName: \"kubernetes.io/projected/3576d729-52a6-41c4-b070-64f86f9bc55b-kube-api-access-p768z\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.486311 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1127e471-e2a5-436b-8433-c56124359062-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.486324 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.486887 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.488662 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.500926 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk" (OuterVolumeSpecName: "kube-api-access-gd9gk") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "kube-api-access-gd9gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.506305 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts" (OuterVolumeSpecName: "scripts") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.543572 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.590687 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.590802 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/99a3cf21-f136-426d-a894-226ba3c2d20b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.590817 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.590828 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd9gk\" (UniqueName: \"kubernetes.io/projected/99a3cf21-f136-426d-a894-226ba3c2d20b-kube-api-access-gd9gk\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.590854 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.594933 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.640241 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data" (OuterVolumeSpecName: "config-data") pod "99a3cf21-f136-426d-a894-226ba3c2d20b" (UID: "99a3cf21-f136-426d-a894-226ba3c2d20b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.683961 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.686304 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.693200 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.693234 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a3cf21-f136-426d-a894-226ba3c2d20b-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.796847 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhr9r\" (UniqueName: \"kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r\") pod \"278a49b0-beb4-430b-ad36-065e5911394b\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.796910 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config\") pod \"278a49b0-beb4-430b-ad36-065e5911394b\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.796966 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797145 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95vxf\" (UniqueName: \"kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797198 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs\") pod \"278a49b0-beb4-430b-ad36-065e5911394b\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797227 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797269 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle\") pod \"278a49b0-beb4-430b-ad36-065e5911394b\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config\") pod \"278a49b0-beb4-430b-ad36-065e5911394b\" (UID: \"278a49b0-beb4-430b-ad36-065e5911394b\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797336 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.797358 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc\") pod \"26678e24-2754-454e-abdc-3add4caf4c81\" (UID: \"26678e24-2754-454e-abdc-3add4caf4c81\") " Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.806228 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf" (OuterVolumeSpecName: "kube-api-access-95vxf") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "kube-api-access-95vxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.810646 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "278a49b0-beb4-430b-ad36-065e5911394b" (UID: "278a49b0-beb4-430b-ad36-065e5911394b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.811989 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r" (OuterVolumeSpecName: "kube-api-access-xhr9r") pod "278a49b0-beb4-430b-ad36-065e5911394b" (UID: "278a49b0-beb4-430b-ad36-065e5911394b"). InnerVolumeSpecName "kube-api-access-xhr9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.894620 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config" (OuterVolumeSpecName: "config") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.897636 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.899931 4717 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod4c971bcc-76c9-43c4-9819-de5e25ed2a3f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod4c971bcc-76c9-43c4-9819-de5e25ed2a3f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod4c971bcc_76c9_43c4_9819_de5e25ed2a3f.slice" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.900426 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95vxf\" (UniqueName: \"kubernetes.io/projected/26678e24-2754-454e-abdc-3add4caf4c81-kube-api-access-95vxf\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.900461 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.900475 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.900486 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhr9r\" (UniqueName: \"kubernetes.io/projected/278a49b0-beb4-430b-ad36-065e5911394b-kube-api-access-xhr9r\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.900495 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.914741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.917875 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "278a49b0-beb4-430b-ad36-065e5911394b" (UID: "278a49b0-beb4-430b-ad36-065e5911394b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.924528 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.930582 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "26678e24-2754-454e-abdc-3add4caf4c81" (UID: "26678e24-2754-454e-abdc-3add4caf4c81"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.959536 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config" (OuterVolumeSpecName: "config") pod "278a49b0-beb4-430b-ad36-065e5911394b" (UID: "278a49b0-beb4-430b-ad36-065e5911394b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:48 crc kubenswrapper[4717]: I1007 14:13:48.974452 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "278a49b0-beb4-430b-ad36-065e5911394b" (UID: "278a49b0-beb4-430b-ad36-065e5911394b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.001945 4717 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.001976 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.001986 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.001995 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.002019 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/26678e24-2754-454e-abdc-3add4caf4c81-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.002030 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/278a49b0-beb4-430b-ad36-065e5911394b-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.012293 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.015118 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bgz59" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.019130 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.028198 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.037342 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zlz22" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.040219 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5b594ccc88-lfbdk" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.041332 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jrb7v" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209161 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-wtwxc" event={"ID":"26678e24-2754-454e-abdc-3add4caf4c81","Type":"ContainerDied","Data":"d555c375a738d298a5a3812493b00490dc9e44026b11f6d424213b1096393e3c"} Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209225 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bgz59" event={"ID":"3576d729-52a6-41c4-b070-64f86f9bc55b","Type":"ContainerDied","Data":"1be0b2eb5adb177178451b9dc23e5efc99a2b1d9ea19b48442d217f3b5fc748a"} Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209243 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be0b2eb5adb177178451b9dc23e5efc99a2b1d9ea19b48442d217f3b5fc748a" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1127e471-e2a5-436b-8433-c56124359062","Type":"ContainerDied","Data":"0333526c7eb99b8b7c23896da2b98c249df98690b571453237c001843015d588"} Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209274 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"99a3cf21-f136-426d-a894-226ba3c2d20b","Type":"ContainerDied","Data":"a39b811b7d965f57ff59c8570d76d6dc6ecf7b46f7e7a7f8317183554424430c"} Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b594ccc88-lfbdk" event={"ID":"278a49b0-beb4-430b-ad36-065e5911394b","Type":"ContainerDied","Data":"5ba0997fa009baec2d8316784fd7b2a375dd71fdb1ae430a78c79353036adade"} Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.209312 4717 scope.go:117] "RemoveContainer" containerID="0c7560b6c5c12fb1a17cabe6cf83cdb6a91e53a8e5675120e7d6f638ba5cecd3" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.288727 4717 scope.go:117] "RemoveContainer" containerID="171d45b0003dbaab76e8a3832a18db738dd7399350976176a2dd79b1efa38fce" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.320162 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.329595 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.371147 4717 scope.go:117] "RemoveContainer" containerID="1f7b6873101b80934ebcd8b04468facfc3035d0249009cd29f3246b3808ae2ab" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.385801 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386215 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="sg-core" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386232 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="sg-core" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386247 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-central-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386254 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-central-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386271 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-notification-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386277 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-notification-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386287 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="proxy-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386293 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="proxy-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386304 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386311 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386321 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-api" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386327 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-api" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386334 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29c601f-2095-4f1f-baf8-8c79792118dd" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386339 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29c601f-2095-4f1f-baf8-8c79792118dd" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386352 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="dnsmasq-dns" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386357 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="dnsmasq-dns" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386375 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0061a317-01ad-4690-a87d-8e2e6f6f3344" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386381 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="0061a317-01ad-4690-a87d-8e2e6f6f3344" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386394 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3576d729-52a6-41c4-b070-64f86f9bc55b" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386400 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3576d729-52a6-41c4-b070-64f86f9bc55b" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 
14:13:49.386409 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386415 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386430 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386435 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.386441 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="init" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386447 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="init" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386620 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386633 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="sg-core" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386642 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-api" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386652 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="ceilometer-central-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386663 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="0061a317-01ad-4690-a87d-8e2e6f6f3344" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386673 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1127e471-e2a5-436b-8433-c56124359062" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386680 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3576d729-52a6-41c4-b070-64f86f9bc55b" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386691 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29c601f-2095-4f1f-baf8-8c79792118dd" containerName="mariadb-database-create" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386698 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="26678e24-2754-454e-abdc-3add4caf4c81" containerName="dnsmasq-dns" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386708 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="278a49b0-beb4-430b-ad36-065e5911394b" containerName="neutron-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386718 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" containerName="proxy-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.386728 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" 
containerName="ceilometer-notification-agent" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.387653 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.393075 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.393283 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.422716 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.439681 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5b594ccc88-lfbdk"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.465204 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.509602 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7bj\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-kube-api-access-hg7bj\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524746 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524772 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524808 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524888 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-logs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.524908 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.527123 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-wtwxc"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.537609 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.543206 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.548343 4717 scope.go:117] "RemoveContainer" containerID="0f04054b9015786809c1f294f2e557e2875402fdf4d407d2b423c550565c207a" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.552847 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.586205 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.586803 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.586823 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: E1007 14:13:49.586846 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.586854 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.587072 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-log" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.587090 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerName="glance-httpd" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.593141 4717 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.598148 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.600211 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.600610 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.603294 4717 scope.go:117] "RemoveContainer" containerID="25abb5e5e2cc6973a5c847f87a3010cf2ce44a9d1857ea16695ba5d21d0fb59e" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.603578 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627151 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627279 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627321 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627437 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627487 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2zzh\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627511 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts\") pod 
\"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627582 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle\") pod \"97ea3951-1b7c-4711-87a5-3e420477d7f7\" (UID: \"97ea3951-1b7c-4711-87a5-3e420477d7f7\") " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627882 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627912 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627957 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-logs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.627983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.628503 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.628566 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.628646 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7bj\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-kube-api-access-hg7bj\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.628770 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 
14:13:49.628796 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.635762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.636140 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.636154 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs" (OuterVolumeSpecName: "logs") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.641087 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.643910 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.648178 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c19f33db-3446-415d-8d32-8f18ac112e2e-logs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.652162 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7bj\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-kube-api-access-hg7bj\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.654571 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts" (OuterVolumeSpecName: "scripts") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.655652 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c19f33db-3446-415d-8d32-8f18ac112e2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.660885 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.661306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.661500 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.664132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c19f33db-3446-415d-8d32-8f18ac112e2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.670183 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh" (OuterVolumeSpecName: "kube-api-access-m2zzh") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "kube-api-access-m2zzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.671352 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph" (OuterVolumeSpecName: "ceph") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.690523 4717 scope.go:117] "RemoveContainer" containerID="751cce8eab3a1d053417b258226f80ca98f6dbefe6e3a990dd5c53f57787a16a" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.705312 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"c19f33db-3446-415d-8d32-8f18ac112e2e\") " pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.710214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.712287 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.718279 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data" (OuterVolumeSpecName: "config-data") pod "97ea3951-1b7c-4711-87a5-3e420477d7f7" (UID: "97ea3951-1b7c-4711-87a5-3e420477d7f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730433 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730450 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730512 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjxwl\" (UniqueName: \"kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730530 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730583 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730706 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730735 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730823 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730834 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730845 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730853 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.730872 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.731433 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97ea3951-1b7c-4711-87a5-3e420477d7f7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.731447 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.731456 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2zzh\" (UniqueName: \"kubernetes.io/projected/97ea3951-1b7c-4711-87a5-3e420477d7f7-kube-api-access-m2zzh\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.731466 4717 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97ea3951-1b7c-4711-87a5-3e420477d7f7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.768426 4717 scope.go:117] "RemoveContainer" containerID="34eb74cecb9e9dc300b8ec1c675d73644144336140dec5974a7e374f30691918" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.773915 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.809197 4717 scope.go:117] "RemoveContainer" containerID="90d3cb8ee609f9347cf9ac6f2259f3dcbb6dec78fdf72356936980ca6fb5fd71" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833575 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833598 4717 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833631 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjxwl\" (UniqueName: \"kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833645 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833670 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833747 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.833753 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.837228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.837276 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.837293 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.837877 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.838576 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.839948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.844494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.847255 4717 scope.go:117] "RemoveContainer" containerID="3bf00faee54a5d7ebf3ec00ac8a8e039f45668051403d10301659fac663a7a4b" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.855345 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjxwl\" (UniqueName: \"kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl\") pod \"ceilometer-0\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " pod="openstack/ceilometer-0" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.887442 4717 scope.go:117] "RemoveContainer" containerID="d83a1c1d4c5ccc4c5f0d50a08e3c5f8730287a5b4c216cef68a918d5fcb2e5b1" Oct 07 14:13:49 crc kubenswrapper[4717]: I1007 14:13:49.922312 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.063067 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerStarted","Data":"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22"} Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.079170 4717 generic.go:334] "Generic (PLEG): container finished" podID="97ea3951-1b7c-4711-87a5-3e420477d7f7" containerID="7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08" exitCode=0 Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.079313 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.079318 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerDied","Data":"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08"} Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.083784 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97ea3951-1b7c-4711-87a5-3e420477d7f7","Type":"ContainerDied","Data":"33e3d1605d165cf872c4a9c2afb6ef5800f5887e7cffc483ec6fe21360b1017f"} Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.083818 4717 scope.go:117] "RemoveContainer" containerID="7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.112751 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.204926 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.218880 4717 scope.go:117] "RemoveContainer" containerID="8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.221336 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.227540 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.231819 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.236562 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.236846 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.237709 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.265307 4717 scope.go:117] "RemoveContainer" containerID="7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08" Oct 07 14:13:50 crc kubenswrapper[4717]: E1007 14:13:50.266213 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08\": container with ID starting with 7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08 not found: ID does not exist" containerID="7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.266279 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08"} err="failed to get container status \"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08\": rpc error: code = NotFound desc = could not find container \"7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08\": container with ID starting with 7cccb99298d858333c2fc14200a4e3a00b884c2bdde759bc2900ea468ae5ab08 not found: ID does not exist" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.266312 4717 scope.go:117] "RemoveContainer" containerID="8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1" Oct 07 14:13:50 crc kubenswrapper[4717]: E1007 14:13:50.278834 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1\": container with ID starting with 8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1 not found: ID does not exist" containerID="8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.278914 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1"} err="failed to get container status \"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1\": rpc error: code = NotFound desc = could not find container \"8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1\": container with ID starting with 8d101a322e027916c72c7a195e1730ef72dd2266496edd2ab900996cb6662db1 not found: ID does not exist" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346467 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346516 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346591 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346628 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346657 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346686 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.346768 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjt7g\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-kube-api-access-fjt7g\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.432640 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.447860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.447901 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.447924 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.447944 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.447977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.448021 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjt7g\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-kube-api-access-fjt7g\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.448068 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.448083 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.448121 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.449526 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " 
pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.451124 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5f3e51b-cc26-48b7-ace9-206022bfc021-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.451379 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.455809 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.456553 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.459637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.463699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.468175 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjt7g\" (UniqueName: \"kubernetes.io/projected/b5f3e51b-cc26-48b7-ace9-206022bfc021-kube-api-access-fjt7g\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.474591 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5f3e51b-cc26-48b7-ace9-206022bfc021-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.503607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5f3e51b-cc26-48b7-ace9-206022bfc021\") " pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.552456 4717 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.624788 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:13:50 crc kubenswrapper[4717]: W1007 14:13:50.655953 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed9670b_92c3_406b_b8d6_c778a73ed735.slice/crio-4993a830ae77bbd84473a3c2887fb95b617676533ab7b92d376b9606ebed9e1e WatchSource:0}: Error finding container 4993a830ae77bbd84473a3c2887fb95b617676533ab7b92d376b9606ebed9e1e: Status 404 returned error can't find the container with id 4993a830ae77bbd84473a3c2887fb95b617676533ab7b92d376b9606ebed9e1e Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.885149 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1127e471-e2a5-436b-8433-c56124359062" path="/var/lib/kubelet/pods/1127e471-e2a5-436b-8433-c56124359062/volumes" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.887096 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26678e24-2754-454e-abdc-3add4caf4c81" path="/var/lib/kubelet/pods/26678e24-2754-454e-abdc-3add4caf4c81/volumes" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.887909 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="278a49b0-beb4-430b-ad36-065e5911394b" path="/var/lib/kubelet/pods/278a49b0-beb4-430b-ad36-065e5911394b/volumes" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.889364 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97ea3951-1b7c-4711-87a5-3e420477d7f7" path="/var/lib/kubelet/pods/97ea3951-1b7c-4711-87a5-3e420477d7f7/volumes" Oct 07 14:13:50 crc kubenswrapper[4717]: I1007 14:13:50.889977 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a3cf21-f136-426d-a894-226ba3c2d20b" path="/var/lib/kubelet/pods/99a3cf21-f136-426d-a894-226ba3c2d20b/volumes" Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.122345 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.138977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerStarted","Data":"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2"} Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.141051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5f3e51b-cc26-48b7-ace9-206022bfc021","Type":"ContainerStarted","Data":"82100468ca23194f3b7da4d3dc8a4b6a822570ef2855486562afb977b7cb4ba5"} Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.147257 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c19f33db-3446-415d-8d32-8f18ac112e2e","Type":"ContainerStarted","Data":"b535ecf125da97eee40d36b07986b4babda79cdd11af16f579084d7fe94d3858"} Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.149608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerStarted","Data":"4993a830ae77bbd84473a3c2887fb95b617676533ab7b92d376b9606ebed9e1e"} Oct 07 14:13:51 crc kubenswrapper[4717]: I1007 14:13:51.161602 4717 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.327007603 podStartE2EDuration="16.161584473s" podCreationTimestamp="2025-10-07 14:13:35 +0000 UTC" firstStartedPulling="2025-10-07 14:13:36.319862894 +0000 UTC m=+1198.147788686" lastFinishedPulling="2025-10-07 14:13:48.154439764 +0000 UTC m=+1209.982365556" observedRunningTime="2025-10-07 14:13:51.16145025 +0000 UTC m=+1212.989376042" watchObservedRunningTime="2025-10-07 14:13:51.161584473 +0000 UTC m=+1212.989510265" Oct 07 14:13:52 crc kubenswrapper[4717]: I1007 14:13:52.177506 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerStarted","Data":"fea139a5dca934798da1936cba7e8d6ec06e506e509ec62423e472e772639ab0"} Oct 07 14:13:52 crc kubenswrapper[4717]: I1007 14:13:52.180238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5f3e51b-cc26-48b7-ace9-206022bfc021","Type":"ContainerStarted","Data":"a081f9e42b304233a773ccb2165c858e00163614fbd39d661604c7f1b4f52762"} Oct 07 14:13:52 crc kubenswrapper[4717]: I1007 14:13:52.182477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c19f33db-3446-415d-8d32-8f18ac112e2e","Type":"ContainerStarted","Data":"93dd423d8e025bf8c32f8df3a14e997b2b9cc1e5166908406672c666e8bbe945"} Oct 07 14:13:52 crc kubenswrapper[4717]: I1007 14:13:52.182516 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c19f33db-3446-415d-8d32-8f18ac112e2e","Type":"ContainerStarted","Data":"4606d5938a30f6ac1bf4d68aa81efbec1dd0792ffc2c228b13d903a8a47915f6"} Oct 07 14:13:52 crc kubenswrapper[4717]: I1007 14:13:52.205143 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.205125091 podStartE2EDuration="3.205125091s" podCreationTimestamp="2025-10-07 14:13:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:52.200568636 +0000 UTC m=+1214.028494438" watchObservedRunningTime="2025-10-07 14:13:52.205125091 +0000 UTC m=+1214.033050883" Oct 07 14:13:53 crc kubenswrapper[4717]: I1007 14:13:53.192940 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerStarted","Data":"0c54e0edc69554f35acfcb2ebafe30c8518631ec9628c3c462a193532b6b942f"} Oct 07 14:13:53 crc kubenswrapper[4717]: I1007 14:13:53.197921 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5f3e51b-cc26-48b7-ace9-206022bfc021","Type":"ContainerStarted","Data":"6d33081abb61f02cccfc3dab3219e4441350af687e698b3ff98be60f6c1a503a"} Oct 07 14:13:53 crc kubenswrapper[4717]: I1007 14:13:53.220103 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.220086555 podStartE2EDuration="3.220086555s" podCreationTimestamp="2025-10-07 14:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:13:53.214451541 +0000 UTC m=+1215.042377343" watchObservedRunningTime="2025-10-07 14:13:53.220086555 +0000 UTC m=+1215.048012347" Oct 07 14:13:54 crc kubenswrapper[4717]: I1007 
14:13:54.207226 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerStarted","Data":"408d1438c122b7b87b0c7e20190cc0c5a562cce378005c79b07a33b4633065d8"} Oct 07 14:13:55 crc kubenswrapper[4717]: I1007 14:13:55.577842 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 07 14:13:57 crc kubenswrapper[4717]: I1007 14:13:57.118974 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 07 14:13:57 crc kubenswrapper[4717]: I1007 14:13:57.153821 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:13:57 crc kubenswrapper[4717]: I1007 14:13:57.231441 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="manila-scheduler" containerID="cri-o://a118246ca00080d5080df0ce6cdabe2cf683953d0dc337fcc159ab1e4e95d4c6" gracePeriod=30 Oct 07 14:13:57 crc kubenswrapper[4717]: I1007 14:13:57.231543 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="probe" containerID="cri-o://3d04e1bbbd495dfc2b9b5c228bde2d5852a718544d4f656b78477b9512c2e905" gracePeriod=30 Oct 07 14:13:58 crc kubenswrapper[4717]: I1007 14:13:58.248630 4717 generic.go:334] "Generic (PLEG): container finished" podID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerID="3d04e1bbbd495dfc2b9b5c228bde2d5852a718544d4f656b78477b9512c2e905" exitCode=0 Oct 07 14:13:58 crc kubenswrapper[4717]: I1007 14:13:58.248730 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerDied","Data":"3d04e1bbbd495dfc2b9b5c228bde2d5852a718544d4f656b78477b9512c2e905"} Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.259117 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerStarted","Data":"77a29cbac9d665a3411f9e8bfef9036efdf829f8c312085ecc76bbe2b50fe197"} Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.259267 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-central-agent" containerID="cri-o://fea139a5dca934798da1936cba7e8d6ec06e506e509ec62423e472e772639ab0" gracePeriod=30 Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.259406 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.259380 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="sg-core" containerID="cri-o://408d1438c122b7b87b0c7e20190cc0c5a562cce378005c79b07a33b4633065d8" gracePeriod=30 Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.259432 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="proxy-httpd" containerID="cri-o://77a29cbac9d665a3411f9e8bfef9036efdf829f8c312085ecc76bbe2b50fe197" gracePeriod=30 Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 
14:13:59.259526 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-notification-agent" containerID="cri-o://0c54e0edc69554f35acfcb2ebafe30c8518631ec9628c3c462a193532b6b942f" gracePeriod=30 Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.291408 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.830628668 podStartE2EDuration="10.291387216s" podCreationTimestamp="2025-10-07 14:13:49 +0000 UTC" firstStartedPulling="2025-10-07 14:13:50.671831342 +0000 UTC m=+1212.499757134" lastFinishedPulling="2025-10-07 14:13:58.13258989 +0000 UTC m=+1219.960515682" observedRunningTime="2025-10-07 14:13:59.283181701 +0000 UTC m=+1221.111107493" watchObservedRunningTime="2025-10-07 14:13:59.291387216 +0000 UTC m=+1221.119313008" Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.834817 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.834880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.868428 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:13:59 crc kubenswrapper[4717]: I1007 14:13:59.876363 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.272538 4717 generic.go:334] "Generic (PLEG): container finished" podID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerID="77a29cbac9d665a3411f9e8bfef9036efdf829f8c312085ecc76bbe2b50fe197" exitCode=0 Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.272922 4717 generic.go:334] "Generic (PLEG): container finished" podID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerID="408d1438c122b7b87b0c7e20190cc0c5a562cce378005c79b07a33b4633065d8" exitCode=2 Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.272933 4717 generic.go:334] "Generic (PLEG): container finished" podID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerID="0c54e0edc69554f35acfcb2ebafe30c8518631ec9628c3c462a193532b6b942f" exitCode=0 Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.272602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerDied","Data":"77a29cbac9d665a3411f9e8bfef9036efdf829f8c312085ecc76bbe2b50fe197"} Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.273101 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerDied","Data":"408d1438c122b7b87b0c7e20190cc0c5a562cce378005c79b07a33b4633065d8"} Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.273124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerDied","Data":"0c54e0edc69554f35acfcb2ebafe30c8518631ec9628c3c462a193532b6b942f"} Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.275236 4717 generic.go:334] "Generic (PLEG): container finished" podID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" 
containerID="a118246ca00080d5080df0ce6cdabe2cf683953d0dc337fcc159ab1e4e95d4c6" exitCode=0 Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.275554 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerDied","Data":"a118246ca00080d5080df0ce6cdabe2cf683953d0dc337fcc159ab1e4e95d4c6"} Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.275911 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.276212 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.554131 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.556489 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.570778 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.588472 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.600665 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.634546 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.634654 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.634778 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tns6\" (UniqueName: \"kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.634812 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.634855 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.635651 4717 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id\") pod \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\" (UID: \"8239b76e-540d-4a4e-bab7-889cd9ee9d3f\") " Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.640274 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.649213 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts" (OuterVolumeSpecName: "scripts") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.649259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6" (OuterVolumeSpecName: "kube-api-access-6tns6") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "kube-api-access-6tns6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.668397 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.703801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.741203 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.741247 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tns6\" (UniqueName: \"kubernetes.io/projected/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-kube-api-access-6tns6\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.741260 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.741269 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.741277 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.814522 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data" (OuterVolumeSpecName: "config-data") pod "8239b76e-540d-4a4e-bab7-889cd9ee9d3f" (UID: "8239b76e-540d-4a4e-bab7-889cd9ee9d3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:00 crc kubenswrapper[4717]: I1007 14:14:00.843294 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8239b76e-540d-4a4e-bab7-889cd9ee9d3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.287762 4717 generic.go:334] "Generic (PLEG): container finished" podID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerID="fea139a5dca934798da1936cba7e8d6ec06e506e509ec62423e472e772639ab0" exitCode=0 Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.287837 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerDied","Data":"fea139a5dca934798da1936cba7e8d6ec06e506e509ec62423e472e772639ab0"} Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.293162 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8239b76e-540d-4a4e-bab7-889cd9ee9d3f","Type":"ContainerDied","Data":"e497cae101c38bb14810ce9c1280819e18e896b653f30c592a6c8ef11552005d"} Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.293236 4717 scope.go:117] "RemoveContainer" containerID="3d04e1bbbd495dfc2b9b5c228bde2d5852a718544d4f656b78477b9512c2e905" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.293238 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.294117 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.294149 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.342174 4717 scope.go:117] "RemoveContainer" containerID="a118246ca00080d5080df0ce6cdabe2cf683953d0dc337fcc159ab1e4e95d4c6" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.345472 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.353615 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.381156 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.381724 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="manila-scheduler" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.381746 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="manila-scheduler" Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.381763 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="probe" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.381772 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="probe" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.382033 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="probe" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.382066 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" containerName="manila-scheduler" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.383481 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.388934 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.410242 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.460387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.460469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd7329a-b653-4574-864a-9d86cbb87ed3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.460666 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.460863 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvjcp\" (UniqueName: \"kubernetes.io/projected/4dd7329a-b653-4574-864a-9d86cbb87ed3-kube-api-access-xvjcp\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.460933 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-scripts\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.461092 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.490535 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-9fe5-account-create-4vbdk"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.492388 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.494928 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.502246 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9fe5-account-create-4vbdk"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562405 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562468 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpckm\" (UniqueName: \"kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm\") pod \"nova-api-9fe5-account-create-4vbdk\" (UID: \"940dabf8-ee08-4071-bf12-296bae9464b7\") " pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562508 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvjcp\" (UniqueName: \"kubernetes.io/projected/4dd7329a-b653-4574-864a-9d86cbb87ed3-kube-api-access-xvjcp\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562538 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-scripts\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562584 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562626 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562672 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd7329a-b653-4574-864a-9d86cbb87ed3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.562800 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4dd7329a-b653-4574-864a-9d86cbb87ed3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.569850 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-scripts\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.571253 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.571711 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.582943 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvjcp\" (UniqueName: \"kubernetes.io/projected/4dd7329a-b653-4574-864a-9d86cbb87ed3-kube-api-access-xvjcp\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.582974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd7329a-b653-4574-864a-9d86cbb87ed3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"4dd7329a-b653-4574-864a-9d86cbb87ed3\") " pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.664240 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpckm\" (UniqueName: \"kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm\") pod \"nova-api-9fe5-account-create-4vbdk\" (UID: \"940dabf8-ee08-4071-bf12-296bae9464b7\") " pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.672479 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6c8b-account-create-7rnnt"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.673751 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.676569 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.692556 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6c8b-account-create-7rnnt"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.693831 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpckm\" (UniqueName: \"kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm\") pod \"nova-api-9fe5-account-create-4vbdk\" (UID: \"940dabf8-ee08-4071-bf12-296bae9464b7\") " pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.724606 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.767394 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cdb\" (UniqueName: \"kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb\") pod \"nova-cell0-6c8b-account-create-7rnnt\" (UID: \"46139707-cb57-4ea6-8146-624e1bc2e42f\") " pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.776452 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.784345 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c898-account-create-ch4ht"] Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.784734 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="proxy-httpd" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.784791 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="proxy-httpd" Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.784807 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-notification-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.784889 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-notification-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.784928 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-central-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.784936 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-central-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: E1007 14:14:01.784956 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="sg-core" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.784963 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="sg-core" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.785179 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-notification-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.785246 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="ceilometer-central-agent" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.785256 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="sg-core" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.785264 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" containerName="proxy-httpd" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.786346 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.793496 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c898-account-create-ch4ht"] Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.802020 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.847421 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.869744 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjxwl\" (UniqueName: \"kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.869933 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870061 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870106 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870129 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870213 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870245 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd\") pod \"3ed9670b-92c3-406b-b8d6-c778a73ed735\" (UID: \"3ed9670b-92c3-406b-b8d6-c778a73ed735\") " Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j5cdb\" (UniqueName: \"kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb\") pod \"nova-cell0-6c8b-account-create-7rnnt\" (UID: \"46139707-cb57-4ea6-8146-624e1bc2e42f\") " pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.870864 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzfh8\" (UniqueName: \"kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8\") pod \"nova-cell1-c898-account-create-ch4ht\" (UID: \"bf7cad25-0861-422f-8ac4-5323c48f28fa\") " pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.871347 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.871430 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.892453 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts" (OuterVolumeSpecName: "scripts") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.897467 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl" (OuterVolumeSpecName: "kube-api-access-cjxwl") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "kube-api-access-cjxwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.902979 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cdb\" (UniqueName: \"kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb\") pod \"nova-cell0-6c8b-account-create-7rnnt\" (UID: \"46139707-cb57-4ea6-8146-624e1bc2e42f\") " pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.932563 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.953974 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974110 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzfh8\" (UniqueName: \"kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8\") pod \"nova-cell1-c898-account-create-ch4ht\" (UID: \"bf7cad25-0861-422f-8ac4-5323c48f28fa\") " pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974278 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjxwl\" (UniqueName: \"kubernetes.io/projected/3ed9670b-92c3-406b-b8d6-c778a73ed735-kube-api-access-cjxwl\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974298 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974310 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974321 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974333 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.974343 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ed9670b-92c3-406b-b8d6-c778a73ed735-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:01 crc kubenswrapper[4717]: I1007 14:14:01.983289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.004342 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzfh8\" (UniqueName: \"kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8\") pod \"nova-cell1-c898-account-create-ch4ht\" (UID: \"bf7cad25-0861-422f-8ac4-5323c48f28fa\") " pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.036441 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data" (OuterVolumeSpecName: "config-data") pod "3ed9670b-92c3-406b-b8d6-c778a73ed735" (UID: "3ed9670b-92c3-406b-b8d6-c778a73ed735"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.087548 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.087595 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9670b-92c3-406b-b8d6-c778a73ed735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.087733 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.126294 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.280057 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 07 14:14:02 crc kubenswrapper[4717]: W1007 14:14:02.304523 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dd7329a_b653_4574_864a_9d86cbb87ed3.slice/crio-e20b3aad6b4777dea54ea5badd55ce8d0fac0b0d476fcfabb096d422db4ebd2b WatchSource:0}: Error finding container e20b3aad6b4777dea54ea5badd55ce8d0fac0b0d476fcfabb096d422db4ebd2b: Status 404 returned error can't find the container with id e20b3aad6b4777dea54ea5badd55ce8d0fac0b0d476fcfabb096d422db4ebd2b Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.308313 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ed9670b-92c3-406b-b8d6-c778a73ed735","Type":"ContainerDied","Data":"4993a830ae77bbd84473a3c2887fb95b617676533ab7b92d376b9606ebed9e1e"} Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.308353 4717 scope.go:117] "RemoveContainer" containerID="77a29cbac9d665a3411f9e8bfef9036efdf829f8c312085ecc76bbe2b50fe197" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.308445 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.353840 4717 scope.go:117] "RemoveContainer" containerID="408d1438c122b7b87b0c7e20190cc0c5a562cce378005c79b07a33b4633065d8" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.361280 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.382940 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.399188 4717 scope.go:117] "RemoveContainer" containerID="0c54e0edc69554f35acfcb2ebafe30c8518631ec9628c3c462a193532b6b942f" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.426525 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.429146 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.444121 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.444307 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.444417 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.449653 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-9fe5-account-create-4vbdk"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.464671 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.477718 4717 scope.go:117] "RemoveContainer" containerID="fea139a5dca934798da1936cba7e8d6ec06e506e509ec62423e472e772639ab0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511107 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 
14:14:02.511318 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrxb\" (UniqueName: \"kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.511537 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.616837 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617134 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617311 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617401 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrxb\" (UniqueName: \"kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617544 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617657 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data\") pod \"ceilometer-0\" (UID: 
\"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.617790 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.618334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.619072 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.619634 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.623971 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.628444 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.628711 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.632731 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.637793 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.663906 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrxb\" (UniqueName: \"kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb\") pod \"ceilometer-0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " 
pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: W1007 14:14:02.664320 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46139707_cb57_4ea6_8146_624e1bc2e42f.slice/crio-ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95 WatchSource:0}: Error finding container ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95: Status 404 returned error can't find the container with id ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95 Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.682512 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6c8b-account-create-7rnnt"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.717424 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.739072 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c898-account-create-ch4ht"] Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.894996 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed9670b-92c3-406b-b8d6-c778a73ed735" path="/var/lib/kubelet/pods/3ed9670b-92c3-406b-b8d6-c778a73ed735/volumes" Oct 07 14:14:02 crc kubenswrapper[4717]: I1007 14:14:02.896322 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8239b76e-540d-4a4e-bab7-889cd9ee9d3f" path="/var/lib/kubelet/pods/8239b76e-540d-4a4e-bab7-889cd9ee9d3f/volumes" Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.242299 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.242837 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.246572 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.371850 4717 generic.go:334] "Generic (PLEG): container finished" podID="940dabf8-ee08-4071-bf12-296bae9464b7" containerID="754d72d0b9ce03a416452fec8eb9d32ff82f3b5cfdfff5ae909f39ac5e65b7a2" exitCode=0 Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.371943 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9fe5-account-create-4vbdk" event={"ID":"940dabf8-ee08-4071-bf12-296bae9464b7","Type":"ContainerDied","Data":"754d72d0b9ce03a416452fec8eb9d32ff82f3b5cfdfff5ae909f39ac5e65b7a2"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.371973 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9fe5-account-create-4vbdk" event={"ID":"940dabf8-ee08-4071-bf12-296bae9464b7","Type":"ContainerStarted","Data":"882014a0346b2cd75eb62590453f85497b3cd0c9204f4e65bada96950d7a1c24"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.379853 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c898-account-create-ch4ht" event={"ID":"bf7cad25-0861-422f-8ac4-5323c48f28fa","Type":"ContainerStarted","Data":"67e18d287c511e4cb04b6b5aaf9a50b715fadf055154978c75eefd8a4b0affe5"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.379906 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c898-account-create-ch4ht" 
event={"ID":"bf7cad25-0861-422f-8ac4-5323c48f28fa","Type":"ContainerStarted","Data":"bc83b6c16ce0ae87bc304a062d3c2023833bb98e8cdd5ab5197ab41945593c8d"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.383308 4717 generic.go:334] "Generic (PLEG): container finished" podID="46139707-cb57-4ea6-8146-624e1bc2e42f" containerID="92954b2eeace8644b17de546a6647c2c824fbff3f530d9da7e69859cce182bd0" exitCode=0 Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.383434 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" event={"ID":"46139707-cb57-4ea6-8146-624e1bc2e42f","Type":"ContainerDied","Data":"92954b2eeace8644b17de546a6647c2c824fbff3f530d9da7e69859cce182bd0"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.383490 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" event={"ID":"46139707-cb57-4ea6-8146-624e1bc2e42f","Type":"ContainerStarted","Data":"ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.386345 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4dd7329a-b653-4574-864a-9d86cbb87ed3","Type":"ContainerStarted","Data":"7855438c641a29637b1646ac440ff1dc3ddbbb4e4b24a13b98b33ba6f96f6268"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.386376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4dd7329a-b653-4574-864a-9d86cbb87ed3","Type":"ContainerStarted","Data":"e20b3aad6b4777dea54ea5badd55ce8d0fac0b0d476fcfabb096d422db4ebd2b"} Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.389196 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:03 crc kubenswrapper[4717]: I1007 14:14:03.441626 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-c898-account-create-ch4ht" podStartSLOduration=2.441604189 podStartE2EDuration="2.441604189s" podCreationTimestamp="2025-10-07 14:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:03.416501001 +0000 UTC m=+1225.244426793" watchObservedRunningTime="2025-10-07 14:14:03.441604189 +0000 UTC m=+1225.269529981" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.160301 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.282778 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.283232 4717 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.286713 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.412235 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"4dd7329a-b653-4574-864a-9d86cbb87ed3","Type":"ContainerStarted","Data":"a984320b971a7a64e6a1fd9d8a63cdd695fa23ac84cd5d37b9a586a6c3fee214"} Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.416642 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf7cad25-0861-422f-8ac4-5323c48f28fa" 
containerID="67e18d287c511e4cb04b6b5aaf9a50b715fadf055154978c75eefd8a4b0affe5" exitCode=0 Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.416724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c898-account-create-ch4ht" event={"ID":"bf7cad25-0861-422f-8ac4-5323c48f28fa","Type":"ContainerDied","Data":"67e18d287c511e4cb04b6b5aaf9a50b715fadf055154978c75eefd8a4b0affe5"} Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.422048 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerStarted","Data":"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965"} Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.422147 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerStarted","Data":"bbc7d749b01a7db446aa27e076f1a1e58c039d010c489bdce2110d064bff9e9e"} Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.436165 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.436144274 podStartE2EDuration="3.436144274s" podCreationTimestamp="2025-10-07 14:14:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:04.435912297 +0000 UTC m=+1226.263838109" watchObservedRunningTime="2025-10-07 14:14:04.436144274 +0000 UTC m=+1226.264070066" Oct 07 14:14:04 crc kubenswrapper[4717]: I1007 14:14:04.923491 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.000795 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5cdb\" (UniqueName: \"kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb\") pod \"46139707-cb57-4ea6-8146-624e1bc2e42f\" (UID: \"46139707-cb57-4ea6-8146-624e1bc2e42f\") " Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.060226 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb" (OuterVolumeSpecName: "kube-api-access-j5cdb") pod "46139707-cb57-4ea6-8146-624e1bc2e42f" (UID: "46139707-cb57-4ea6-8146-624e1bc2e42f"). InnerVolumeSpecName "kube-api-access-j5cdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.086693 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.117463 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5cdb\" (UniqueName: \"kubernetes.io/projected/46139707-cb57-4ea6-8146-624e1bc2e42f-kube-api-access-j5cdb\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.218868 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpckm\" (UniqueName: \"kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm\") pod \"940dabf8-ee08-4071-bf12-296bae9464b7\" (UID: \"940dabf8-ee08-4071-bf12-296bae9464b7\") " Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.224668 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm" (OuterVolumeSpecName: "kube-api-access-zpckm") pod "940dabf8-ee08-4071-bf12-296bae9464b7" (UID: "940dabf8-ee08-4071-bf12-296bae9464b7"). InnerVolumeSpecName "kube-api-access-zpckm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.326567 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpckm\" (UniqueName: \"kubernetes.io/projected/940dabf8-ee08-4071-bf12-296bae9464b7-kube-api-access-zpckm\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.432253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerStarted","Data":"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c"} Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.433529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" event={"ID":"46139707-cb57-4ea6-8146-624e1bc2e42f","Type":"ContainerDied","Data":"ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95"} Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.433557 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab4d2a6e68fb07fa8eec177c93c62afb3c840c74c08b3ef981c3210426cf8f95" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.433607 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6c8b-account-create-7rnnt" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.440373 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-9fe5-account-create-4vbdk" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.441168 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-9fe5-account-create-4vbdk" event={"ID":"940dabf8-ee08-4071-bf12-296bae9464b7","Type":"ContainerDied","Data":"882014a0346b2cd75eb62590453f85497b3cd0c9204f4e65bada96950d7a1c24"} Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.441370 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="882014a0346b2cd75eb62590453f85497b3cd0c9204f4e65bada96950d7a1c24" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.717236 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.836112 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzfh8\" (UniqueName: \"kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8\") pod \"bf7cad25-0861-422f-8ac4-5323c48f28fa\" (UID: \"bf7cad25-0861-422f-8ac4-5323c48f28fa\") " Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.844192 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8" (OuterVolumeSpecName: "kube-api-access-gzfh8") pod "bf7cad25-0861-422f-8ac4-5323c48f28fa" (UID: "bf7cad25-0861-422f-8ac4-5323c48f28fa"). InnerVolumeSpecName "kube-api-access-gzfh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:05 crc kubenswrapper[4717]: I1007 14:14:05.939204 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzfh8\" (UniqueName: \"kubernetes.io/projected/bf7cad25-0861-422f-8ac4-5323c48f28fa-kube-api-access-gzfh8\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:06 crc kubenswrapper[4717]: I1007 14:14:06.449837 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c898-account-create-ch4ht" event={"ID":"bf7cad25-0861-422f-8ac4-5323c48f28fa","Type":"ContainerDied","Data":"bc83b6c16ce0ae87bc304a062d3c2023833bb98e8cdd5ab5197ab41945593c8d"} Oct 07 14:14:06 crc kubenswrapper[4717]: I1007 14:14:06.450146 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc83b6c16ce0ae87bc304a062d3c2023833bb98e8cdd5ab5197ab41945593c8d" Oct 07 14:14:06 crc kubenswrapper[4717]: I1007 14:14:06.449858 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c898-account-create-ch4ht" Oct 07 14:14:06 crc kubenswrapper[4717]: I1007 14:14:06.451491 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerStarted","Data":"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302"} Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.361628 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j9cmq"] Oct 07 14:14:07 crc kubenswrapper[4717]: E1007 14:14:07.362251 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46139707-cb57-4ea6-8146-624e1bc2e42f" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362263 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46139707-cb57-4ea6-8146-624e1bc2e42f" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: E1007 14:14:07.362289 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940dabf8-ee08-4071-bf12-296bae9464b7" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362295 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="940dabf8-ee08-4071-bf12-296bae9464b7" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: E1007 14:14:07.362307 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7cad25-0861-422f-8ac4-5323c48f28fa" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362313 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7cad25-0861-422f-8ac4-5323c48f28fa" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362484 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="46139707-cb57-4ea6-8146-624e1bc2e42f" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362504 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7cad25-0861-422f-8ac4-5323c48f28fa" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.362540 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="940dabf8-ee08-4071-bf12-296bae9464b7" containerName="mariadb-account-create" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.363270 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.366163 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.366225 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.366224 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bdsrk" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.373452 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j9cmq"] Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.466110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.466307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddpsg\" (UniqueName: \"kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.466435 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.466593 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.497590 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.568446 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.568902 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.570266 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " 
pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.570520 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.571842 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddpsg\" (UniqueName: \"kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.583233 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.585623 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.587791 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.588147 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddpsg\" (UniqueName: \"kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg\") pod \"nova-cell0-conductor-db-sync-j9cmq\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:07 crc kubenswrapper[4717]: I1007 14:14:07.701927 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:08 crc kubenswrapper[4717]: I1007 14:14:08.310345 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j9cmq"] Oct 07 14:14:08 crc kubenswrapper[4717]: I1007 14:14:08.468780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" event={"ID":"c55d6263-cc3d-41fc-8024-06ca7612fece","Type":"ContainerStarted","Data":"496c1be6e4f2760f0b7d68f2a2d6c599a7b6a6014611ab3cdfa43a6cd61a5eeb"} Oct 07 14:14:08 crc kubenswrapper[4717]: I1007 14:14:08.468957 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="manila-share" containerID="cri-o://ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" gracePeriod=30 Oct 07 14:14:08 crc kubenswrapper[4717]: I1007 14:14:08.469025 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="probe" containerID="cri-o://a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" gracePeriod=30 Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.463376 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489334 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerID="a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" exitCode=0 Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489383 4717 generic.go:334] "Generic (PLEG): container finished" podID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerID="ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" exitCode=1 Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489423 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerDied","Data":"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2"} Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489453 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerDied","Data":"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22"} Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489464 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce","Type":"ContainerDied","Data":"1556651da3dbe2c23a1fcd97a5bd25122b6f4de856f5153b70b13ffabdbe92eb"} Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489478 4717 scope.go:117] "RemoveContainer" containerID="a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.489589 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.516106 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerStarted","Data":"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a"} Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.516722 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.558758 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.512177277 podStartE2EDuration="7.558740606s" podCreationTimestamp="2025-10-07 14:14:02 +0000 UTC" firstStartedPulling="2025-10-07 14:14:03.422530646 +0000 UTC m=+1225.250456438" lastFinishedPulling="2025-10-07 14:14:08.469093975 +0000 UTC m=+1230.297019767" observedRunningTime="2025-10-07 14:14:09.547287072 +0000 UTC m=+1231.375212864" watchObservedRunningTime="2025-10-07 14:14:09.558740606 +0000 UTC m=+1231.386666398" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.564469 4717 scope.go:117] "RemoveContainer" containerID="ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.594126 4717 scope.go:117] "RemoveContainer" containerID="a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" Oct 07 14:14:09 crc kubenswrapper[4717]: E1007 14:14:09.594555 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2\": container with ID starting with a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2 not found: ID does not exist" containerID="a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.594589 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2"} err="failed to get container status \"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2\": rpc error: code = NotFound desc = could not find container \"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2\": container with ID starting with a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2 not found: ID does not exist" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.594614 4717 scope.go:117] "RemoveContainer" containerID="ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" Oct 07 14:14:09 crc kubenswrapper[4717]: E1007 14:14:09.594973 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22\": container with ID starting with ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22 not found: ID does not exist" containerID="ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.595025 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22"} err="failed to get container status \"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22\": rpc error: code = 
NotFound desc = could not find container \"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22\": container with ID starting with ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22 not found: ID does not exist" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.595051 4717 scope.go:117] "RemoveContainer" containerID="a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.595530 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2"} err="failed to get container status \"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2\": rpc error: code = NotFound desc = could not find container \"a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2\": container with ID starting with a5c7a2dbe0679ebda452750a404a810e4b61418ba37a771d79958ea7f65cc5f2 not found: ID does not exist" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.595551 4717 scope.go:117] "RemoveContainer" containerID="ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.595823 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22"} err="failed to get container status \"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22\": rpc error: code = NotFound desc = could not find container \"ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22\": container with ID starting with ff2242d1c52a08732ec3b0e753cef7e82ff99edc29a0c68bacb3eb0a7eaaad22 not found: ID does not exist" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.622145 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.622413 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "var-lib-manila". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623305 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623400 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623426 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623506 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dwzt\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623531 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623584 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.623682 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id\") pod \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\" (UID: \"f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce\") " Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.624517 4717 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-var-lib-manila\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.625142 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.649436 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.649457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts" (OuterVolumeSpecName: "scripts") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.649659 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph" (OuterVolumeSpecName: "ceph") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.653266 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt" (OuterVolumeSpecName: "kube-api-access-5dwzt") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "kube-api-access-5dwzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.687811 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726340 4717 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726375 4717 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-ceph\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726385 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dwzt\" (UniqueName: \"kubernetes.io/projected/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-kube-api-access-5dwzt\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726396 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726404 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.726414 4717 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.774363 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data" (OuterVolumeSpecName: "config-data") pod "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" (UID: "f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.827798 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.924648 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.936628 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.947537 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:09 crc kubenswrapper[4717]: E1007 14:14:09.947938 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="probe" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.947959 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="probe" Oct 07 14:14:09 crc kubenswrapper[4717]: E1007 14:14:09.947984 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="manila-share" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.947991 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="manila-share" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.948172 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="manila-share" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.948183 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" containerName="probe" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.957809 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.959813 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 07 14:14:09 crc kubenswrapper[4717]: I1007 14:14:09.960945 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.031529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.031632 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-ceph\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.031777 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.031896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.031980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.032034 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmd8\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-kube-api-access-lrmd8\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.032142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.032222 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-scripts\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc 
kubenswrapper[4717]: I1007 14:14:10.134630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134692 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-ceph\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134768 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134886 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134906 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmd8\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-kube-api-access-lrmd8\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134946 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.134961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.135037 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-scripts\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.135041 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/7fc0839c-9b96-43c5-9111-5d9c24b471b9-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.139447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.141080 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.141903 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-scripts\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.142203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fc0839c-9b96-43c5-9111-5d9c24b471b9-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.158586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-ceph\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.164718 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmd8\" (UniqueName: \"kubernetes.io/projected/7fc0839c-9b96-43c5-9111-5d9c24b471b9-kube-api-access-lrmd8\") pod \"manila-share-share1-0\" (UID: \"7fc0839c-9b96-43c5-9111-5d9c24b471b9\") " pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.280668 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.880706 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce" path="/var/lib/kubelet/pods/f9a1e8f9-4aed-44d5-9ca8-4c566328c3ce/volumes" Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.881768 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:10 crc kubenswrapper[4717]: I1007 14:14:10.940154 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 07 14:14:11 crc kubenswrapper[4717]: W1007 14:14:11.018355 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fc0839c_9b96_43c5_9111_5d9c24b471b9.slice/crio-20e676ed12cd8eb36ffc32283cb19632d68d16ae5fc7ca547ee6af445db5abae WatchSource:0}: Error finding container 20e676ed12cd8eb36ffc32283cb19632d68d16ae5fc7ca547ee6af445db5abae: Status 404 returned error can't find the container with id 20e676ed12cd8eb36ffc32283cb19632d68d16ae5fc7ca547ee6af445db5abae Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.547483 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7fc0839c-9b96-43c5-9111-5d9c24b471b9","Type":"ContainerStarted","Data":"20e676ed12cd8eb36ffc32283cb19632d68d16ae5fc7ca547ee6af445db5abae"} Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.547843 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-central-agent" containerID="cri-o://31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965" gracePeriod=30 Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.547897 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-notification-agent" containerID="cri-o://26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c" gracePeriod=30 Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.547893 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="sg-core" containerID="cri-o://22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302" gracePeriod=30 Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.548014 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="proxy-httpd" containerID="cri-o://79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a" gracePeriod=30 Oct 07 14:14:11 crc kubenswrapper[4717]: I1007 14:14:11.725408 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.568777 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerID="79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a" exitCode=0 Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.569049 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerID="22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302" exitCode=2 
Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.569059 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerID="26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c" exitCode=0 Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.568893 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerDied","Data":"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a"} Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.569128 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerDied","Data":"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302"} Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.569144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerDied","Data":"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c"} Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.599390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7fc0839c-9b96-43c5-9111-5d9c24b471b9","Type":"ContainerStarted","Data":"f88d42f1084619149f9d847c6acc93d9862b87815293112f26212009b24eeadc"} Oct 07 14:14:12 crc kubenswrapper[4717]: I1007 14:14:12.599433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"7fc0839c-9b96-43c5-9111-5d9c24b471b9","Type":"ContainerStarted","Data":"500435ac7d3dc9fdec1ecb856888e08e8ed7464f18717b82c7b52e1056b3d11e"} Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.023297 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.055537 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.055512864 podStartE2EDuration="4.055512864s" podCreationTimestamp="2025-10-07 14:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:12.630516923 +0000 UTC m=+1234.458442715" watchObservedRunningTime="2025-10-07 14:14:13.055512864 +0000 UTC m=+1234.883438656" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107219 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107369 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107397 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107423 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107492 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107539 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxrxb\" (UniqueName: \"kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107595 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107623 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd\") pod \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\" (UID: \"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0\") " Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.107768 4717 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.108070 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.108311 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.115508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts" (OuterVolumeSpecName: "scripts") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.116162 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb" (OuterVolumeSpecName: "kube-api-access-qxrxb") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "kube-api-access-qxrxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.142325 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.178077 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.209840 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.210128 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.210250 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxrxb\" (UniqueName: \"kubernetes.io/projected/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-kube-api-access-qxrxb\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.210357 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.210445 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.224991 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.228521 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data" (OuterVolumeSpecName: "config-data") pod "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" (UID: "ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.312273 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.312309 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.622206 4717 generic.go:334] "Generic (PLEG): container finished" podID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerID="31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965" exitCode=0 Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.622367 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.622399 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerDied","Data":"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965"} Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.622425 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0","Type":"ContainerDied","Data":"bbc7d749b01a7db446aa27e076f1a1e58c039d010c489bdce2110d064bff9e9e"} Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.622442 4717 scope.go:117] "RemoveContainer" containerID="79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.681168 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.704739 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.729248 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:13 crc kubenswrapper[4717]: E1007 14:14:13.729762 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-central-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.729790 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-central-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: E1007 14:14:13.729823 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="proxy-httpd" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.729832 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="proxy-httpd" Oct 07 14:14:13 crc kubenswrapper[4717]: E1007 14:14:13.729854 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="sg-core" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.729864 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="sg-core" Oct 07 14:14:13 crc kubenswrapper[4717]: E1007 14:14:13.729908 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-notification-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.729918 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-notification-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.730204 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-notification-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.730262 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="ceilometer-central-agent" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.730283 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="sg-core" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 
14:14:13.730299 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" containerName="proxy-httpd" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.732533 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.734801 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.736353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.736545 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.737275 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.823520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kl8g\" (UniqueName: \"kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.823987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824407 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824554 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824659 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts\") pod \"ceilometer-0\" (UID: 
\"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.824936 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927253 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927519 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927579 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kl8g\" (UniqueName: \"kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927633 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.927661 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.936593 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data\") pod \"ceilometer-0\" (UID: 
\"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.936744 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.937669 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.943494 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.943614 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.943784 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.944409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:13 crc kubenswrapper[4717]: I1007 14:14:13.949321 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kl8g\" (UniqueName: \"kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g\") pod \"ceilometer-0\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " pod="openstack/ceilometer-0" Oct 07 14:14:14 crc kubenswrapper[4717]: I1007 14:14:14.055166 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:14 crc kubenswrapper[4717]: I1007 14:14:14.880277 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0" path="/var/lib/kubelet/pods/ad6b9c11-d4e4-4d60-ad5b-03eb756feeb0/volumes" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.451939 4717 scope.go:117] "RemoveContainer" containerID="22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.506911 4717 scope.go:117] "RemoveContainer" containerID="26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.629045 4717 scope.go:117] "RemoveContainer" containerID="31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.694938 4717 scope.go:117] "RemoveContainer" containerID="79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a" Oct 07 14:14:17 crc kubenswrapper[4717]: E1007 14:14:17.695442 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a\": container with ID starting with 79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a not found: ID does not exist" containerID="79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.695490 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a"} err="failed to get container status \"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a\": rpc error: code = NotFound desc = could not find container \"79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a\": container with ID starting with 79ee7c505cffc612248c84af6d54626b73ac1f47dfb0bcf1c75a68f4f3878e6a not found: ID does not exist" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.695526 4717 scope.go:117] "RemoveContainer" containerID="22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302" Oct 07 14:14:17 crc kubenswrapper[4717]: E1007 14:14:17.696269 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302\": container with ID starting with 22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302 not found: ID does not exist" containerID="22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.696300 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302"} err="failed to get container status \"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302\": rpc error: code = NotFound desc = could not find container \"22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302\": container with ID starting with 22a9c5e592cbbd9ddd4c1d19ebe666875666ee7e765a0915e67f3b80d0da9302 not found: ID does not exist" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.696324 4717 scope.go:117] "RemoveContainer" containerID="26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c" Oct 07 14:14:17 crc kubenswrapper[4717]: E1007 
14:14:17.696690 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c\": container with ID starting with 26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c not found: ID does not exist" containerID="26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.696716 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c"} err="failed to get container status \"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c\": rpc error: code = NotFound desc = could not find container \"26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c\": container with ID starting with 26606c2eb9a576b4bbdc63c46258377073f2637f90d32b3a96ab8d703b291f5c not found: ID does not exist" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.696734 4717 scope.go:117] "RemoveContainer" containerID="31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965" Oct 07 14:14:17 crc kubenswrapper[4717]: E1007 14:14:17.696982 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965\": container with ID starting with 31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965 not found: ID does not exist" containerID="31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.697028 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965"} err="failed to get container status \"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965\": rpc error: code = NotFound desc = could not find container \"31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965\": container with ID starting with 31fe81314991f3101eb75ebfe6bec15f95e88c127f1150b6efc538b226ef0965 not found: ID does not exist" Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.957887 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:17 crc kubenswrapper[4717]: W1007 14:14:17.958839 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3824972b_cd50_49bb_8bd1_3fec2bcafd03.slice/crio-73d0ac15c2fa700bbe7723424b636d7678cf86f1259aaaf972a66d8612024f0a WatchSource:0}: Error finding container 73d0ac15c2fa700bbe7723424b636d7678cf86f1259aaaf972a66d8612024f0a: Status 404 returned error can't find the container with id 73d0ac15c2fa700bbe7723424b636d7678cf86f1259aaaf972a66d8612024f0a Oct 07 14:14:17 crc kubenswrapper[4717]: I1007 14:14:17.961134 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:14:18 crc kubenswrapper[4717]: I1007 14:14:18.682602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerStarted","Data":"73d0ac15c2fa700bbe7723424b636d7678cf86f1259aaaf972a66d8612024f0a"} Oct 07 14:14:18 crc kubenswrapper[4717]: I1007 14:14:18.684836 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-j9cmq" event={"ID":"c55d6263-cc3d-41fc-8024-06ca7612fece","Type":"ContainerStarted","Data":"a8b4ce98facd374334e6447a5f20df51dc3aa1749c78a336636e048afd923735"} Oct 07 14:14:18 crc kubenswrapper[4717]: I1007 14:14:18.701354 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" podStartSLOduration=2.629098149 podStartE2EDuration="11.701337696s" podCreationTimestamp="2025-10-07 14:14:07 +0000 UTC" firstStartedPulling="2025-10-07 14:14:08.446498045 +0000 UTC m=+1230.274423837" lastFinishedPulling="2025-10-07 14:14:17.518737592 +0000 UTC m=+1239.346663384" observedRunningTime="2025-10-07 14:14:18.700673728 +0000 UTC m=+1240.528599520" watchObservedRunningTime="2025-10-07 14:14:18.701337696 +0000 UTC m=+1240.529263488" Oct 07 14:14:19 crc kubenswrapper[4717]: I1007 14:14:19.252919 4717 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod1127e471-e2a5-436b-8433-c56124359062"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod1127e471-e2a5-436b-8433-c56124359062] : Timed out while waiting for systemd to remove kubepods-besteffort-pod1127e471_e2a5_436b_8433_c56124359062.slice" Oct 07 14:14:19 crc kubenswrapper[4717]: I1007 14:14:19.696415 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerStarted","Data":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} Oct 07 14:14:20 crc kubenswrapper[4717]: I1007 14:14:20.281604 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 07 14:14:20 crc kubenswrapper[4717]: I1007 14:14:20.706393 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerStarted","Data":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} Oct 07 14:14:22 crc kubenswrapper[4717]: I1007 14:14:22.731365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerStarted","Data":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} Oct 07 14:14:23 crc kubenswrapper[4717]: I1007 14:14:23.445600 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 07 14:14:24 crc kubenswrapper[4717]: I1007 14:14:24.335560 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760202 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerStarted","Data":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760691 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760431 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="proxy-httpd" containerID="cri-o://1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" gracePeriod=30 Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760358 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-central-agent" containerID="cri-o://08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" gracePeriod=30 Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760514 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="sg-core" containerID="cri-o://b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" gracePeriod=30 Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.760484 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-notification-agent" containerID="cri-o://9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" gracePeriod=30 Oct 07 14:14:25 crc kubenswrapper[4717]: I1007 14:14:25.787979 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.279049514 podStartE2EDuration="12.787955897s" podCreationTimestamp="2025-10-07 14:14:13 +0000 UTC" firstStartedPulling="2025-10-07 14:14:17.960771392 +0000 UTC m=+1239.788697194" lastFinishedPulling="2025-10-07 14:14:24.469677775 +0000 UTC m=+1246.297603577" observedRunningTime="2025-10-07 14:14:25.781553261 +0000 UTC m=+1247.609479053" watchObservedRunningTime="2025-10-07 14:14:25.787955897 +0000 UTC m=+1247.615881689" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.519238 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.572830 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.572899 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.572967 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kl8g\" (UniqueName: \"kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.573060 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.573085 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 
14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.573108 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.573252 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.573289 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts\") pod \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\" (UID: \"3824972b-cd50-49bb-8bd1-3fec2bcafd03\") " Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.574418 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.574529 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.582365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts" (OuterVolumeSpecName: "scripts") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.583531 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g" (OuterVolumeSpecName: "kube-api-access-6kl8g") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "kube-api-access-6kl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.612549 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.641437 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675509 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675544 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675558 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675569 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kl8g\" (UniqueName: \"kubernetes.io/projected/3824972b-cd50-49bb-8bd1-3fec2bcafd03-kube-api-access-6kl8g\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675581 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.675590 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3824972b-cd50-49bb-8bd1-3fec2bcafd03-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.680104 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.684812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data" (OuterVolumeSpecName: "config-data") pod "3824972b-cd50-49bb-8bd1-3fec2bcafd03" (UID: "3824972b-cd50-49bb-8bd1-3fec2bcafd03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775693 4717 generic.go:334] "Generic (PLEG): container finished" podID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" exitCode=0 Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775726 4717 generic.go:334] "Generic (PLEG): container finished" podID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" exitCode=2 Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775734 4717 generic.go:334] "Generic (PLEG): container finished" podID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" exitCode=0 Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775742 4717 generic.go:334] "Generic (PLEG): container finished" podID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" exitCode=0 Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerDied","Data":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775855 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerDied","Data":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775865 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerDied","Data":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerDied","Data":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3824972b-cd50-49bb-8bd1-3fec2bcafd03","Type":"ContainerDied","Data":"73d0ac15c2fa700bbe7723424b636d7678cf86f1259aaaf972a66d8612024f0a"} Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.775898 4717 scope.go:117] "RemoveContainer" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.776029 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.783606 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.783687 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3824972b-cd50-49bb-8bd1-3fec2bcafd03-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.814542 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.822289 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.832210 4717 scope.go:117] "RemoveContainer" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.841597 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.842055 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="sg-core" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842072 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="sg-core" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.842088 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-central-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842095 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-central-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.842104 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-notification-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842110 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-notification-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.842120 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="proxy-httpd" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842125 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="proxy-httpd" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842316 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="sg-core" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842331 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-notification-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842340 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="ceilometer-central-agent" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.842350 4717 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" containerName="proxy-httpd" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.844084 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.846641 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.846816 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.846963 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.859348 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.864026 4717 scope.go:117] "RemoveContainer" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889431 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqwq\" (UniqueName: \"kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889683 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889784 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889812 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889849 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889912 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889950 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.889966 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.893447 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3824972b-cd50-49bb-8bd1-3fec2bcafd03" path="/var/lib/kubelet/pods/3824972b-cd50-49bb-8bd1-3fec2bcafd03/volumes" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.952076 4717 scope.go:117] "RemoveContainer" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.972703 4717 scope.go:117] "RemoveContainer" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.973173 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": container with ID starting with 1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78 not found: ID does not exist" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973229 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} err="failed to get container status \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": rpc error: code = NotFound desc = could not find container \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": container with ID starting with 1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973260 4717 scope.go:117] "RemoveContainer" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.973575 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": container with ID starting with b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d not found: ID does not exist" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973602 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} err="failed to get container status \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": rpc error: code = NotFound desc = could not find container \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": container with ID starting with b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973617 4717 scope.go:117] 
"RemoveContainer" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.973833 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": container with ID starting with 9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201 not found: ID does not exist" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973856 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} err="failed to get container status \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": rpc error: code = NotFound desc = could not find container \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": container with ID starting with 9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.973868 4717 scope.go:117] "RemoveContainer" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: E1007 14:14:26.974500 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": container with ID starting with 08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29 not found: ID does not exist" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.974521 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} err="failed to get container status \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": rpc error: code = NotFound desc = could not find container \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": container with ID starting with 08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.974534 4717 scope.go:117] "RemoveContainer" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.974841 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} err="failed to get container status \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": rpc error: code = NotFound desc = could not find container \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": container with ID starting with 1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.974870 4717 scope.go:117] "RemoveContainer" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975295 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} err="failed to get container status \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": rpc error: code = NotFound desc = could not find container \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": container with ID starting with b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975362 4717 scope.go:117] "RemoveContainer" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975632 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} err="failed to get container status \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": rpc error: code = NotFound desc = could not find container \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": container with ID starting with 9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975672 4717 scope.go:117] "RemoveContainer" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975924 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} err="failed to get container status \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": rpc error: code = NotFound desc = could not find container \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": container with ID starting with 08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.975951 4717 scope.go:117] "RemoveContainer" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.976388 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} err="failed to get container status \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": rpc error: code = NotFound desc = could not find container \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": container with ID starting with 1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.976411 4717 scope.go:117] "RemoveContainer" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.976666 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} err="failed to get container status \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": rpc error: code = NotFound desc = could not find container \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": container with ID starting with b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d not found: ID does not exist" Oct 
07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.976686 4717 scope.go:117] "RemoveContainer" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.976993 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} err="failed to get container status \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": rpc error: code = NotFound desc = could not find container \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": container with ID starting with 9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977035 4717 scope.go:117] "RemoveContainer" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977244 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} err="failed to get container status \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": rpc error: code = NotFound desc = could not find container \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": container with ID starting with 08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977263 4717 scope.go:117] "RemoveContainer" containerID="1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977462 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78"} err="failed to get container status \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": rpc error: code = NotFound desc = could not find container \"1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78\": container with ID starting with 1fa3dc516f76fb2dbfbfe02d0823e3ea34c4136a064a5dc0d8224017aefe7e78 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977484 4717 scope.go:117] "RemoveContainer" containerID="b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977718 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d"} err="failed to get container status \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": rpc error: code = NotFound desc = could not find container \"b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d\": container with ID starting with b53c6fcc2c5c0d7b5b3cf18a2b5a11f4824d2f2a4daa316b50010bd80b4b5a6d not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.977745 4717 scope.go:117] "RemoveContainer" containerID="9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.978071 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201"} err="failed to get container status 
\"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": rpc error: code = NotFound desc = could not find container \"9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201\": container with ID starting with 9ec944d493771f1b95e7b127d71acfb76c77a17b25a8837ea4c81ff5c1670201 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.978091 4717 scope.go:117] "RemoveContainer" containerID="08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.978316 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29"} err="failed to get container status \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": rpc error: code = NotFound desc = could not find container \"08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29\": container with ID starting with 08f1a4d0fa9c12a2b8608da2616563e84e5b1f74a3f852b695ded201e588fb29 not found: ID does not exist" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991749 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991783 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991819 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991844 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " 
pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.991910 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqwq\" (UniqueName: \"kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.993528 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:26 crc kubenswrapper[4717]: I1007 14:14:26.994132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.005544 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.005628 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.005748 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.006267 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.006343 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.009131 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wqwq\" (UniqueName: \"kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq\") pod \"ceilometer-0\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.163843 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.650883 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:14:27 crc kubenswrapper[4717]: I1007 14:14:27.785025 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerStarted","Data":"6762ab9c17c3b8ca52772d3632daae5a68002f845ca17fb14fa5e567cec6356f"} Oct 07 14:14:28 crc kubenswrapper[4717]: I1007 14:14:28.795176 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerStarted","Data":"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d"} Oct 07 14:14:29 crc kubenswrapper[4717]: I1007 14:14:29.815711 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerStarted","Data":"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b"} Oct 07 14:14:30 crc kubenswrapper[4717]: I1007 14:14:30.829502 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerStarted","Data":"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f"} Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.841074 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerStarted","Data":"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6"} Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.842544 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.846564 4717 generic.go:334] "Generic (PLEG): container finished" podID="c55d6263-cc3d-41fc-8024-06ca7612fece" containerID="a8b4ce98facd374334e6447a5f20df51dc3aa1749c78a336636e048afd923735" exitCode=0 Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.846596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" event={"ID":"c55d6263-cc3d-41fc-8024-06ca7612fece","Type":"ContainerDied","Data":"a8b4ce98facd374334e6447a5f20df51dc3aa1749c78a336636e048afd923735"} Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.866723 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.0499106559999998 podStartE2EDuration="5.866705958s" podCreationTimestamp="2025-10-07 14:14:26 +0000 UTC" firstStartedPulling="2025-10-07 14:14:27.657062609 +0000 UTC m=+1249.484988401" lastFinishedPulling="2025-10-07 14:14:31.473857911 +0000 UTC m=+1253.301783703" observedRunningTime="2025-10-07 14:14:31.866472331 +0000 UTC m=+1253.694398123" watchObservedRunningTime="2025-10-07 14:14:31.866705958 +0000 UTC m=+1253.694631750" Oct 07 14:14:31 crc kubenswrapper[4717]: I1007 14:14:31.894532 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.288091 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.428398 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddpsg\" (UniqueName: \"kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg\") pod \"c55d6263-cc3d-41fc-8024-06ca7612fece\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.428599 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts\") pod \"c55d6263-cc3d-41fc-8024-06ca7612fece\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.428673 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle\") pod \"c55d6263-cc3d-41fc-8024-06ca7612fece\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.428759 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data\") pod \"c55d6263-cc3d-41fc-8024-06ca7612fece\" (UID: \"c55d6263-cc3d-41fc-8024-06ca7612fece\") " Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.439286 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg" (OuterVolumeSpecName: "kube-api-access-ddpsg") pod "c55d6263-cc3d-41fc-8024-06ca7612fece" (UID: "c55d6263-cc3d-41fc-8024-06ca7612fece"). InnerVolumeSpecName "kube-api-access-ddpsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.439392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts" (OuterVolumeSpecName: "scripts") pod "c55d6263-cc3d-41fc-8024-06ca7612fece" (UID: "c55d6263-cc3d-41fc-8024-06ca7612fece"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.454550 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c55d6263-cc3d-41fc-8024-06ca7612fece" (UID: "c55d6263-cc3d-41fc-8024-06ca7612fece"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.456843 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data" (OuterVolumeSpecName: "config-data") pod "c55d6263-cc3d-41fc-8024-06ca7612fece" (UID: "c55d6263-cc3d-41fc-8024-06ca7612fece"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.531367 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.531681 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.531798 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddpsg\" (UniqueName: \"kubernetes.io/projected/c55d6263-cc3d-41fc-8024-06ca7612fece-kube-api-access-ddpsg\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.531887 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c55d6263-cc3d-41fc-8024-06ca7612fece-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.864823 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.864851 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-j9cmq" event={"ID":"c55d6263-cc3d-41fc-8024-06ca7612fece","Type":"ContainerDied","Data":"496c1be6e4f2760f0b7d68f2a2d6c599a7b6a6014611ab3cdfa43a6cd61a5eeb"} Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.864891 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="496c1be6e4f2760f0b7d68f2a2d6c599a7b6a6014611ab3cdfa43a6cd61a5eeb" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.993357 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:14:33 crc kubenswrapper[4717]: E1007 14:14:33.994059 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55d6263-cc3d-41fc-8024-06ca7612fece" containerName="nova-cell0-conductor-db-sync" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.994133 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55d6263-cc3d-41fc-8024-06ca7612fece" containerName="nova-cell0-conductor-db-sync" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.994495 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55d6263-cc3d-41fc-8024-06ca7612fece" containerName="nova-cell0-conductor-db-sync" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.995784 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:33 crc kubenswrapper[4717]: I1007 14:14:33.998982 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.000819 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bdsrk" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.004978 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.142495 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfzq\" (UniqueName: \"kubernetes.io/projected/cb822ac3-55c1-4745-bc6d-570b89e66108-kube-api-access-jsfzq\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.142663 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.142716 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.244323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfzq\" (UniqueName: \"kubernetes.io/projected/cb822ac3-55c1-4745-bc6d-570b89e66108-kube-api-access-jsfzq\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.244428 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.244469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.249502 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.261671 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfzq\" (UniqueName: \"kubernetes.io/projected/cb822ac3-55c1-4745-bc6d-570b89e66108-kube-api-access-jsfzq\") pod 
\"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.269157 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb822ac3-55c1-4745-bc6d-570b89e66108-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"cb822ac3-55c1-4745-bc6d-570b89e66108\") " pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.333057 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.783070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 07 14:14:34 crc kubenswrapper[4717]: I1007 14:14:34.879898 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb822ac3-55c1-4745-bc6d-570b89e66108","Type":"ContainerStarted","Data":"88bb1c058548d90a27ebc292a721fc7a4baf87f8d81bb62a2db1a7f377738afb"} Oct 07 14:14:35 crc kubenswrapper[4717]: I1007 14:14:35.890192 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"cb822ac3-55c1-4745-bc6d-570b89e66108","Type":"ContainerStarted","Data":"372c7e627f968233dcd2b04c421faa0b1f5ad45457a1cd9caf93f9b892d4b918"} Oct 07 14:14:35 crc kubenswrapper[4717]: I1007 14:14:35.890596 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:35 crc kubenswrapper[4717]: I1007 14:14:35.915260 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.915236089 podStartE2EDuration="2.915236089s" podCreationTimestamp="2025-10-07 14:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:35.904930486 +0000 UTC m=+1257.732856278" watchObservedRunningTime="2025-10-07 14:14:35.915236089 +0000 UTC m=+1257.743161881" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.359177 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.786420 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-s4mqg"] Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.787893 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.790954 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.792840 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.799356 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s4mqg"] Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.847606 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.847724 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.847804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.847980 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcr78\" (UniqueName: \"kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.952332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.952489 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.952581 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.952748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcr78\" (UniqueName: 
\"kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.954207 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.955925 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.966224 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.967043 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.974559 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.982514 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:39 crc kubenswrapper[4717]: I1007 14:14:39.992629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.015629 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcr78\" (UniqueName: \"kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78\") pod \"nova-cell0-cell-mapping-s4mqg\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.078069 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.079842 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.082434 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.105489 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.124171 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.166337 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7whf\" (UniqueName: \"kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.166733 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.166781 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.166890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.166945 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hbg4\" (UniqueName: \"kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.167052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.167083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.167153 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.177869 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.179291 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.184076 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.234313 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.254456 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.266133 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.267361 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268512 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268576 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7whf\" (UniqueName: \"kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268621 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268712 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268760 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hbg4\" (UniqueName: \"kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.268839 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.270364 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.273973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.286714 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.292943 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.293698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.293745 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.297072 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.300810 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.310504 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.316955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7whf\" (UniqueName: \"kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf\") pod \"nova-api-0\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.320996 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hbg4\" (UniqueName: \"kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4\") pod \"nova-metadata-0\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.344592 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.370913 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j44dx\" (UniqueName: \"kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371048 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371142 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371214 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lhw2\" (UniqueName: \"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371374 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.371401 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.420271 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.473985 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474058 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d4cd\" (UniqueName: \"kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474081 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lhw2\" (UniqueName: \"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474115 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474149 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474207 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j44dx\" (UniqueName: \"kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474260 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474296 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474327 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474351 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474378 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.474568 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.475722 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.475884 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: 
\"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.476145 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.476284 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.480520 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.489413 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.492203 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.492949 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lhw2\" (UniqueName: \"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2\") pod \"dnsmasq-dns-7d5fbbb8c5-fwxxg\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.495970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j44dx\" (UniqueName: \"kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx\") pod \"nova-scheduler-0\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.528025 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.556650 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.576451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d4cd\" (UniqueName: \"kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.576593 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.576635 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.581648 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.583094 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.593573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d4cd\" (UniqueName: \"kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd\") pod \"nova-cell1-novncproxy-0\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.628721 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.635844 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:40 crc kubenswrapper[4717]: W1007 14:14:40.930148 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4d74639_11be_451b_ad1f_56897155fc06.slice/crio-cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b WatchSource:0}: Error finding container cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b: Status 404 returned error can't find the container with id cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.932662 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-s4mqg"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.982199 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m6bpx"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.988330 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s4mqg" event={"ID":"e4d74639-11be-451b-ad1f-56897155fc06","Type":"ContainerStarted","Data":"cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b"} Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.990211 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.993745 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m6bpx"] Oct 07 14:14:40 crc kubenswrapper[4717]: I1007 14:14:40.994775 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.001205 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.035229 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.097078 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xcq8\" (UniqueName: \"kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.097229 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.097277 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.097378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.199736 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.199832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.199918 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.200640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xcq8\" (UniqueName: \"kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.209044 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.228107 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.229093 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.229148 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xcq8\" (UniqueName: \"kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8\") pod \"nova-cell1-conductor-db-sync-m6bpx\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.283793 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 
14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.323288 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.344991 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.362957 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.473898 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:14:41 crc kubenswrapper[4717]: I1007 14:14:41.832244 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m6bpx"] Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.009241 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" event={"ID":"48bc633c-b89e-4eff-957b-9f2cd14038bb","Type":"ContainerStarted","Data":"c91c295c7280bc9af26b319191bca25e1f3a41a28a7278946b382c327387c643"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.014774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerStarted","Data":"989a1fcc60b24ed7efc0717d27c44cf96508de701346745f5f07634d5e1d3df2"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.025243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s4mqg" event={"ID":"e4d74639-11be-451b-ad1f-56897155fc06","Type":"ContainerStarted","Data":"246d2fc2425512ceccdf324e7c83ce6724aec8ab2ee2beea75e1291fcdadefb3"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.042416 4717 generic.go:334] "Generic (PLEG): container finished" podID="2acf8450-e2d5-404b-a98c-d9efba061461" containerID="3d5b8fb06e24ec12ee704c978682cc77f8ac65ea3492d500405afb487e1c6eab" exitCode=0 Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.042692 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" event={"ID":"2acf8450-e2d5-404b-a98c-d9efba061461","Type":"ContainerDied","Data":"3d5b8fb06e24ec12ee704c978682cc77f8ac65ea3492d500405afb487e1c6eab"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.042721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" event={"ID":"2acf8450-e2d5-404b-a98c-d9efba061461","Type":"ContainerStarted","Data":"f583b8773e8d07b3474de1529de409d3532416d433d5bac0510f3feb21961f7d"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.051596 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-s4mqg" podStartSLOduration=3.0515738040000002 podStartE2EDuration="3.051573804s" podCreationTimestamp="2025-10-07 14:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:42.042076333 +0000 UTC m=+1263.870002125" watchObservedRunningTime="2025-10-07 14:14:42.051573804 +0000 UTC m=+1263.879499596" Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.054284 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdb35360-6956-47bd-956b-1d17e8c72024","Type":"ContainerStarted","Data":"59f2d0c945b410b02f3ec7e765938253c2e0a71882b704c0c56530b080ac1ca8"} Oct 07 14:14:42 crc 
kubenswrapper[4717]: I1007 14:14:42.065260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerStarted","Data":"feebf2e3cdab2626d5ce62dd0dc8b4da0b31856f0eb5964504c52b42f7e34ab1"} Oct 07 14:14:42 crc kubenswrapper[4717]: I1007 14:14:42.078941 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a7467c-3d2c-4946-970b-ac3c5ddef8e5","Type":"ContainerStarted","Data":"405e9329396969aebcc739c1f2537337bc28b70a32d91edc5177e12869088d93"} Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.102043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" event={"ID":"2acf8450-e2d5-404b-a98c-d9efba061461","Type":"ContainerStarted","Data":"9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8"} Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.102371 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.110469 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" event={"ID":"48bc633c-b89e-4eff-957b-9f2cd14038bb","Type":"ContainerStarted","Data":"69506bce28f7bce0d8a2f673a3affe898d6303fd2dc4ccf5d827f2559359c315"} Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.134224 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" podStartSLOduration=3.134200109 podStartE2EDuration="3.134200109s" podCreationTimestamp="2025-10-07 14:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:43.126650581 +0000 UTC m=+1264.954576373" watchObservedRunningTime="2025-10-07 14:14:43.134200109 +0000 UTC m=+1264.962125901" Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.153326 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" podStartSLOduration=3.153309534 podStartE2EDuration="3.153309534s" podCreationTimestamp="2025-10-07 14:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:43.149220852 +0000 UTC m=+1264.977146654" watchObservedRunningTime="2025-10-07 14:14:43.153309534 +0000 UTC m=+1264.981235326" Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.819143 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:14:43 crc kubenswrapper[4717]: I1007 14:14:43.828569 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.135381 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a7467c-3d2c-4946-970b-ac3c5ddef8e5","Type":"ContainerStarted","Data":"9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e"} Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.140970 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerStarted","Data":"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781"} Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.142753 4717 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdb35360-6956-47bd-956b-1d17e8c72024","Type":"ContainerStarted","Data":"c4b247182175d5c2daa2eac6e3b519f8495244bbffb48463bbf11859dde1a8dd"} Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.142941 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="fdb35360-6956-47bd-956b-1d17e8c72024" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c4b247182175d5c2daa2eac6e3b519f8495244bbffb48463bbf11859dde1a8dd" gracePeriod=30 Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.147185 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerStarted","Data":"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169"} Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.161401 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.107010838 podStartE2EDuration="5.161376265s" podCreationTimestamp="2025-10-07 14:14:40 +0000 UTC" firstStartedPulling="2025-10-07 14:14:41.280708827 +0000 UTC m=+1263.108634629" lastFinishedPulling="2025-10-07 14:14:44.335074264 +0000 UTC m=+1266.163000056" observedRunningTime="2025-10-07 14:14:45.154625159 +0000 UTC m=+1266.982550951" watchObservedRunningTime="2025-10-07 14:14:45.161376265 +0000 UTC m=+1266.989302057" Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.191377 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.164343924 podStartE2EDuration="5.191352979s" podCreationTimestamp="2025-10-07 14:14:40 +0000 UTC" firstStartedPulling="2025-10-07 14:14:41.307629857 +0000 UTC m=+1263.135555649" lastFinishedPulling="2025-10-07 14:14:44.334638902 +0000 UTC m=+1266.162564704" observedRunningTime="2025-10-07 14:14:45.178315421 +0000 UTC m=+1267.006241213" watchObservedRunningTime="2025-10-07 14:14:45.191352979 +0000 UTC m=+1267.019278771" Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.557668 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 14:14:45 crc kubenswrapper[4717]: I1007 14:14:45.636957 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.160780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerStarted","Data":"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e"} Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.163378 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-log" containerID="cri-o://7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" gracePeriod=30 Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.163457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerStarted","Data":"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408"} Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.163760 4717 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-metadata" containerID="cri-o://01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" gracePeriod=30 Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.193813 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.895606931 podStartE2EDuration="7.19379057s" podCreationTimestamp="2025-10-07 14:14:39 +0000 UTC" firstStartedPulling="2025-10-07 14:14:41.042595072 +0000 UTC m=+1262.870520864" lastFinishedPulling="2025-10-07 14:14:44.340778711 +0000 UTC m=+1266.168704503" observedRunningTime="2025-10-07 14:14:46.183330842 +0000 UTC m=+1268.011256644" watchObservedRunningTime="2025-10-07 14:14:46.19379057 +0000 UTC m=+1268.021716362" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.202721 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.151217037 podStartE2EDuration="6.202701115s" podCreationTimestamp="2025-10-07 14:14:40 +0000 UTC" firstStartedPulling="2025-10-07 14:14:41.282952449 +0000 UTC m=+1263.110878241" lastFinishedPulling="2025-10-07 14:14:44.334436527 +0000 UTC m=+1266.162362319" observedRunningTime="2025-10-07 14:14:46.200806522 +0000 UTC m=+1268.028732334" watchObservedRunningTime="2025-10-07 14:14:46.202701115 +0000 UTC m=+1268.030626917" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.760445 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.831075 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs\") pod \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.831182 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hbg4\" (UniqueName: \"kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4\") pod \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.831212 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle\") pod \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.831302 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data\") pod \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\" (UID: \"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335\") " Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.832517 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs" (OuterVolumeSpecName: "logs") pod "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" (UID: "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.851386 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4" (OuterVolumeSpecName: "kube-api-access-6hbg4") pod "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" (UID: "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335"). InnerVolumeSpecName "kube-api-access-6hbg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.865193 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data" (OuterVolumeSpecName: "config-data") pod "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" (UID: "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.865707 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" (UID: "f3b3624b-b7e5-4b58-aa55-f95ac5a9e335"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.932682 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.932712 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hbg4\" (UniqueName: \"kubernetes.io/projected/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-kube-api-access-6hbg4\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.932722 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:46 crc kubenswrapper[4717]: I1007 14:14:46.932734 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174488 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerID="01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" exitCode=0 Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174519 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerID="7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" exitCode=143 Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174554 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174605 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerDied","Data":"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408"} Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174630 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerDied","Data":"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169"} Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174639 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f3b3624b-b7e5-4b58-aa55-f95ac5a9e335","Type":"ContainerDied","Data":"feebf2e3cdab2626d5ce62dd0dc8b4da0b31856f0eb5964504c52b42f7e34ab1"} Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.174653 4717 scope.go:117] "RemoveContainer" containerID="01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.202146 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.211136 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.225215 4717 scope.go:117] "RemoveContainer" containerID="7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.235324 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:47 crc kubenswrapper[4717]: E1007 14:14:47.235844 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-metadata" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.235866 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-metadata" Oct 07 14:14:47 crc kubenswrapper[4717]: E1007 14:14:47.235897 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-log" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.235906 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-log" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.236180 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-log" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.236229 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" containerName="nova-metadata-metadata" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.237972 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.240409 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.240722 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.248270 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.254383 4717 scope.go:117] "RemoveContainer" containerID="01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" Oct 07 14:14:47 crc kubenswrapper[4717]: E1007 14:14:47.257124 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408\": container with ID starting with 01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408 not found: ID does not exist" containerID="01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257164 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408"} err="failed to get container status \"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408\": rpc error: code = NotFound desc = could not find container \"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408\": container with ID starting with 01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408 not found: ID does not exist" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257187 4717 scope.go:117] "RemoveContainer" containerID="7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" Oct 07 14:14:47 crc kubenswrapper[4717]: E1007 14:14:47.257427 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169\": container with ID starting with 7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169 not found: ID does not exist" containerID="7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257450 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169"} err="failed to get container status \"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169\": rpc error: code = NotFound desc = could not find container \"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169\": container with ID starting with 7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169 not found: ID does not exist" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257467 4717 scope.go:117] "RemoveContainer" containerID="01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257651 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408"} err="failed to get container status \"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408\": rpc error: 
code = NotFound desc = could not find container \"01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408\": container with ID starting with 01932d5794505cdda5198fa10556caf92c394ec954d0ed19aaac68be416e5408 not found: ID does not exist" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257672 4717 scope.go:117] "RemoveContainer" containerID="7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.257836 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169"} err="failed to get container status \"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169\": rpc error: code = NotFound desc = could not find container \"7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169\": container with ID starting with 7818dda59cc27f0e68d77addfc8cc777682df17bc46c55d496f145bbae7f0169 not found: ID does not exist" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.339792 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.339847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.339896 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.339918 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.340017 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nznzs\" (UniqueName: \"kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.441443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.441748 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.441857 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nznzs\" (UniqueName: \"kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.441977 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.442035 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.442493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.455169 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.456326 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.456447 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.471998 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nznzs\" (UniqueName: \"kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs\") pod \"nova-metadata-0\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.557535 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:47 crc kubenswrapper[4717]: I1007 14:14:47.989411 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:47 crc kubenswrapper[4717]: W1007 14:14:47.989516 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b9530b6_0c14_47ae_9d54_56e2f7f3d450.slice/crio-883a57e24d0391c27fcf2da97b8d046446e83c47fb19041a408750b7eb5fc452 WatchSource:0}: Error finding container 883a57e24d0391c27fcf2da97b8d046446e83c47fb19041a408750b7eb5fc452: Status 404 returned error can't find the container with id 883a57e24d0391c27fcf2da97b8d046446e83c47fb19041a408750b7eb5fc452 Oct 07 14:14:48 crc kubenswrapper[4717]: I1007 14:14:48.188243 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerStarted","Data":"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e"} Oct 07 14:14:48 crc kubenswrapper[4717]: I1007 14:14:48.188289 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerStarted","Data":"883a57e24d0391c27fcf2da97b8d046446e83c47fb19041a408750b7eb5fc452"} Oct 07 14:14:48 crc kubenswrapper[4717]: I1007 14:14:48.892454 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b3624b-b7e5-4b58-aa55-f95ac5a9e335" path="/var/lib/kubelet/pods/f3b3624b-b7e5-4b58-aa55-f95ac5a9e335/volumes" Oct 07 14:14:49 crc kubenswrapper[4717]: I1007 14:14:49.199334 4717 generic.go:334] "Generic (PLEG): container finished" podID="e4d74639-11be-451b-ad1f-56897155fc06" containerID="246d2fc2425512ceccdf324e7c83ce6724aec8ab2ee2beea75e1291fcdadefb3" exitCode=0 Oct 07 14:14:49 crc kubenswrapper[4717]: I1007 14:14:49.199487 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s4mqg" event={"ID":"e4d74639-11be-451b-ad1f-56897155fc06","Type":"ContainerDied","Data":"246d2fc2425512ceccdf324e7c83ce6724aec8ab2ee2beea75e1291fcdadefb3"} Oct 07 14:14:49 crc kubenswrapper[4717]: I1007 14:14:49.202885 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerStarted","Data":"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80"} Oct 07 14:14:49 crc kubenswrapper[4717]: I1007 14:14:49.245471 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.245445273 podStartE2EDuration="2.245445273s" podCreationTimestamp="2025-10-07 14:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:49.236185949 +0000 UTC m=+1271.064111741" watchObservedRunningTime="2025-10-07 14:14:49.245445273 +0000 UTC m=+1271.073371075" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.421851 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.422172 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.557659 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 14:14:50 crc 
kubenswrapper[4717]: I1007 14:14:50.586369 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.594183 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.607549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts\") pod \"e4d74639-11be-451b-ad1f-56897155fc06\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.607742 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle\") pod \"e4d74639-11be-451b-ad1f-56897155fc06\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.607877 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data\") pod \"e4d74639-11be-451b-ad1f-56897155fc06\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.607929 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcr78\" (UniqueName: \"kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78\") pod \"e4d74639-11be-451b-ad1f-56897155fc06\" (UID: \"e4d74639-11be-451b-ad1f-56897155fc06\") " Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.619927 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78" (OuterVolumeSpecName: "kube-api-access-tcr78") pod "e4d74639-11be-451b-ad1f-56897155fc06" (UID: "e4d74639-11be-451b-ad1f-56897155fc06"). InnerVolumeSpecName "kube-api-access-tcr78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.620297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts" (OuterVolumeSpecName: "scripts") pod "e4d74639-11be-451b-ad1f-56897155fc06" (UID: "e4d74639-11be-451b-ad1f-56897155fc06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.633951 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.662726 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4d74639-11be-451b-ad1f-56897155fc06" (UID: "e4d74639-11be-451b-ad1f-56897155fc06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.676237 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data" (OuterVolumeSpecName: "config-data") pod "e4d74639-11be-451b-ad1f-56897155fc06" (UID: "e4d74639-11be-451b-ad1f-56897155fc06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.701114 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.704362 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="dnsmasq-dns" containerID="cri-o://a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd" gracePeriod=10 Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.718875 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcr78\" (UniqueName: \"kubernetes.io/projected/e4d74639-11be-451b-ad1f-56897155fc06-kube-api-access-tcr78\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.718938 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.718956 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.718970 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4d74639-11be-451b-ad1f-56897155fc06-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:50 crc kubenswrapper[4717]: I1007 14:14:50.736442 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.179:5353: connect: connection reset by peer" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.195326 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.229490 4717 generic.go:334] "Generic (PLEG): container finished" podID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerID="a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd" exitCode=0 Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.229549 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.229570 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" event={"ID":"54988e9a-d6e5-46da-b117-ed0dc218d42c","Type":"ContainerDied","Data":"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd"} Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.229600 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f9d689-6k48l" event={"ID":"54988e9a-d6e5-46da-b117-ed0dc218d42c","Type":"ContainerDied","Data":"ea25dd4a787b903b4037d57b089a89cec974d843cf09d4ca5b681860b07eeded"} Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.229620 4717 scope.go:117] "RemoveContainer" containerID="a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.237346 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-s4mqg" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.238428 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-s4mqg" event={"ID":"e4d74639-11be-451b-ad1f-56897155fc06","Type":"ContainerDied","Data":"cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b"} Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.238465 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbca9c02a9ddcffc10c8c1facef48ea98fdff8ebfe6139804a054b9e0001043b" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.272095 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.279274 4717 scope.go:117] "RemoveContainer" containerID="e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.320342 4717 scope.go:117] "RemoveContainer" containerID="a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd" Oct 07 14:14:51 crc kubenswrapper[4717]: E1007 14:14:51.320716 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd\": container with ID starting with a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd not found: ID does not exist" containerID="a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.320748 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd"} err="failed to get container status \"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd\": rpc error: code = NotFound desc = could not find container \"a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd\": container with ID starting with a89504595a20959f2d0d5bc41d7810b0d0b5eaafff83456457e7b0744c564dbd not found: ID does not exist" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.320770 4717 scope.go:117] "RemoveContainer" containerID="e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49" Oct 07 14:14:51 crc kubenswrapper[4717]: E1007 14:14:51.321063 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49\": container with ID starting with e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49 not found: ID does not exist" containerID="e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.321094 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49"} err="failed to get container status \"e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49\": rpc error: code = NotFound desc = could not find container \"e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49\": container with ID starting with e75a133d911a81627fc17f80574f65e0692073b6286db6273ff808632cc88e49 not found: ID does not exist" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354166 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgmqz\" (UniqueName: \"kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354372 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354485 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354538 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.354750 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb\") pod \"54988e9a-d6e5-46da-b117-ed0dc218d42c\" (UID: \"54988e9a-d6e5-46da-b117-ed0dc218d42c\") " Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.361485 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz" (OuterVolumeSpecName: "kube-api-access-rgmqz") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "kube-api-access-rgmqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.400740 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.400980 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-log" containerID="cri-o://5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781" gracePeriod=30 Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.401328 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-api" containerID="cri-o://b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e" gracePeriod=30 Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.424964 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.424964 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.426868 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.427153 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-log" containerID="cri-o://9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" gracePeriod=30 Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.427221 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-metadata" containerID="cri-o://f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" gracePeriod=30 Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.441727 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config" (OuterVolumeSpecName: "config") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.450848 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.457455 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgmqz\" (UniqueName: \"kubernetes.io/projected/54988e9a-d6e5-46da-b117-ed0dc218d42c-kube-api-access-rgmqz\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.457481 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.457495 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.466484 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.470445 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.494163 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54988e9a-d6e5-46da-b117-ed0dc218d42c" (UID: "54988e9a-d6e5-46da-b117-ed0dc218d42c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.559533 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.559569 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.559579 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54988e9a-d6e5-46da-b117-ed0dc218d42c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.616989 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.626600 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5865f9d689-6k48l"] Oct 07 14:14:51 crc kubenswrapper[4717]: I1007 14:14:51.723900 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.042912 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.171356 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle\") pod \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.171517 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs\") pod \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.171549 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nznzs\" (UniqueName: \"kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs\") pod \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.171699 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs\") pod \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.171874 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data\") pod \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\" (UID: \"2b9530b6-0c14-47ae-9d54-56e2f7f3d450\") " Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.172381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs" (OuterVolumeSpecName: "logs") pod 
"2b9530b6-0c14-47ae-9d54-56e2f7f3d450" (UID: "2b9530b6-0c14-47ae-9d54-56e2f7f3d450"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.172888 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.176109 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs" (OuterVolumeSpecName: "kube-api-access-nznzs") pod "2b9530b6-0c14-47ae-9d54-56e2f7f3d450" (UID: "2b9530b6-0c14-47ae-9d54-56e2f7f3d450"). InnerVolumeSpecName "kube-api-access-nznzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.198855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b9530b6-0c14-47ae-9d54-56e2f7f3d450" (UID: "2b9530b6-0c14-47ae-9d54-56e2f7f3d450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.208336 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data" (OuterVolumeSpecName: "config-data") pod "2b9530b6-0c14-47ae-9d54-56e2f7f3d450" (UID: "2b9530b6-0c14-47ae-9d54-56e2f7f3d450"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.223735 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2b9530b6-0c14-47ae-9d54-56e2f7f3d450" (UID: "2b9530b6-0c14-47ae-9d54-56e2f7f3d450"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.261932 4717 generic.go:334] "Generic (PLEG): container finished" podID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerID="f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" exitCode=0 Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.261965 4717 generic.go:334] "Generic (PLEG): container finished" podID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerID="9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" exitCode=143 Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.262037 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerDied","Data":"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80"} Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.262082 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerDied","Data":"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e"} Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.262110 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b9530b6-0c14-47ae-9d54-56e2f7f3d450","Type":"ContainerDied","Data":"883a57e24d0391c27fcf2da97b8d046446e83c47fb19041a408750b7eb5fc452"} Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.262126 4717 scope.go:117] "RemoveContainer" containerID="f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.262223 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.266487 4717 generic.go:334] "Generic (PLEG): container finished" podID="48bc633c-b89e-4eff-957b-9f2cd14038bb" containerID="69506bce28f7bce0d8a2f673a3affe898d6303fd2dc4ccf5d827f2559359c315" exitCode=0 Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.266538 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" event={"ID":"48bc633c-b89e-4eff-957b-9f2cd14038bb","Type":"ContainerDied","Data":"69506bce28f7bce0d8a2f673a3affe898d6303fd2dc4ccf5d827f2559359c315"} Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.269909 4717 generic.go:334] "Generic (PLEG): container finished" podID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerID="5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781" exitCode=143 Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.270553 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerDied","Data":"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781"} Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.274740 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.274763 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 
14:14:52.274772 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nznzs\" (UniqueName: \"kubernetes.io/projected/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-kube-api-access-nznzs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.274783 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9530b6-0c14-47ae-9d54-56e2f7f3d450-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.288904 4717 scope.go:117] "RemoveContainer" containerID="9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.310075 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.324223 4717 scope.go:117] "RemoveContainer" containerID="f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.325469 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80\": container with ID starting with f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80 not found: ID does not exist" containerID="f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.325509 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80"} err="failed to get container status \"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80\": rpc error: code = NotFound desc = could not find container \"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80\": container with ID starting with f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80 not found: ID does not exist" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.325535 4717 scope.go:117] "RemoveContainer" containerID="9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.325775 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e\": container with ID starting with 9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e not found: ID does not exist" containerID="9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.325797 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e"} err="failed to get container status \"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e\": rpc error: code = NotFound desc = could not find container \"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e\": container with ID starting with 9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e not found: ID does not exist" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.325819 4717 scope.go:117] "RemoveContainer" containerID="f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.326172 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80"} err="failed to get container status \"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80\": rpc error: code = NotFound desc = could not find container \"f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80\": container with ID starting with f6b45f3bd91b81c1c8e69ee045fb67a56f892227c6753a3b8dbc4036bd47ea80 not found: ID does not exist" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.326195 4717 scope.go:117] "RemoveContainer" containerID="9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.326377 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e"} err="failed to get container status \"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e\": rpc error: code = NotFound desc = could not find container \"9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e\": container with ID starting with 9e69b7516b8dd35f821328eb12726a8dedb2796d095ba0a4c44636067361778e not found: ID does not exist" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.330735 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.338962 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.339526 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4d74639-11be-451b-ad1f-56897155fc06" containerName="nova-manage" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339538 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4d74639-11be-451b-ad1f-56897155fc06" containerName="nova-manage" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.339563 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-metadata" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339570 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-metadata" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.339586 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-log" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339594 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-log" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.339605 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="dnsmasq-dns" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339611 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="dnsmasq-dns" Oct 07 14:14:52 crc kubenswrapper[4717]: E1007 14:14:52.339630 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="init" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339635 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" 
containerName="init" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339829 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" containerName="dnsmasq-dns" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339845 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-metadata" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339856 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" containerName="nova-metadata-log" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.339869 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4d74639-11be-451b-ad1f-56897155fc06" containerName="nova-manage" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.344231 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.354107 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.354378 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.365050 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.477651 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.477691 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.477765 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxzw\" (UniqueName: \"kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.477836 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.478414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.579677 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.579974 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.580038 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmxzw\" (UniqueName: \"kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.580107 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.580141 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.580698 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.584682 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.584827 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.590809 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.601300 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxzw\" (UniqueName: \"kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw\") pod \"nova-metadata-0\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " pod="openstack/nova-metadata-0" Oct 07 14:14:52 crc kubenswrapper[4717]: I1007 14:14:52.678098 4717 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:52.885001 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9530b6-0c14-47ae-9d54-56e2f7f3d450" path="/var/lib/kubelet/pods/2b9530b6-0c14-47ae-9d54-56e2f7f3d450/volumes" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:52.887884 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54988e9a-d6e5-46da-b117-ed0dc218d42c" path="/var/lib/kubelet/pods/54988e9a-d6e5-46da-b117-ed0dc218d42c/volumes" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.279824 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerName="nova-scheduler-scheduler" containerID="cri-o://9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" gracePeriod=30 Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.780606 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.781189 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:53 crc kubenswrapper[4717]: W1007 14:14:53.792720 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd55baeec_55bc_4176_a082_81433bcb0c42.slice/crio-ace1d2be5a1596b85ba54d64bc00a691ca67dbff62db777ea3a12228179664e5 WatchSource:0}: Error finding container ace1d2be5a1596b85ba54d64bc00a691ca67dbff62db777ea3a12228179664e5: Status 404 returned error can't find the container with id ace1d2be5a1596b85ba54d64bc00a691ca67dbff62db777ea3a12228179664e5 Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.821515 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data\") pod \"48bc633c-b89e-4eff-957b-9f2cd14038bb\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.821595 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts\") pod \"48bc633c-b89e-4eff-957b-9f2cd14038bb\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.821637 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xcq8\" (UniqueName: \"kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8\") pod \"48bc633c-b89e-4eff-957b-9f2cd14038bb\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.821761 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle\") pod \"48bc633c-b89e-4eff-957b-9f2cd14038bb\" (UID: \"48bc633c-b89e-4eff-957b-9f2cd14038bb\") " Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.827988 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts" (OuterVolumeSpecName: "scripts") pod "48bc633c-b89e-4eff-957b-9f2cd14038bb" (UID: 
"48bc633c-b89e-4eff-957b-9f2cd14038bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.827996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8" (OuterVolumeSpecName: "kube-api-access-5xcq8") pod "48bc633c-b89e-4eff-957b-9f2cd14038bb" (UID: "48bc633c-b89e-4eff-957b-9f2cd14038bb"). InnerVolumeSpecName "kube-api-access-5xcq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.854411 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data" (OuterVolumeSpecName: "config-data") pod "48bc633c-b89e-4eff-957b-9f2cd14038bb" (UID: "48bc633c-b89e-4eff-957b-9f2cd14038bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.854571 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48bc633c-b89e-4eff-957b-9f2cd14038bb" (UID: "48bc633c-b89e-4eff-957b-9f2cd14038bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.923902 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.923950 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.923968 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xcq8\" (UniqueName: \"kubernetes.io/projected/48bc633c-b89e-4eff-957b-9f2cd14038bb-kube-api-access-5xcq8\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:53 crc kubenswrapper[4717]: I1007 14:14:53.923984 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48bc633c-b89e-4eff-957b-9f2cd14038bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.290043 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" event={"ID":"48bc633c-b89e-4eff-957b-9f2cd14038bb","Type":"ContainerDied","Data":"c91c295c7280bc9af26b319191bca25e1f3a41a28a7278946b382c327387c643"} Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.290320 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91c295c7280bc9af26b319191bca25e1f3a41a28a7278946b382c327387c643" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.290077 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-m6bpx" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.292978 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerStarted","Data":"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1"} Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.293038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerStarted","Data":"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c"} Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.293052 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerStarted","Data":"ace1d2be5a1596b85ba54d64bc00a691ca67dbff62db777ea3a12228179664e5"} Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.325713 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.325694761 podStartE2EDuration="2.325694761s" podCreationTimestamp="2025-10-07 14:14:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:54.314078691 +0000 UTC m=+1276.142004483" watchObservedRunningTime="2025-10-07 14:14:54.325694761 +0000 UTC m=+1276.153620553" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.371073 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:14:54 crc kubenswrapper[4717]: E1007 14:14:54.371789 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bc633c-b89e-4eff-957b-9f2cd14038bb" containerName="nova-cell1-conductor-db-sync" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.371965 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bc633c-b89e-4eff-957b-9f2cd14038bb" containerName="nova-cell1-conductor-db-sync" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.372252 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bc633c-b89e-4eff-957b-9f2cd14038bb" containerName="nova-cell1-conductor-db-sync" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.372989 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.375078 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.395342 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.435157 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.435221 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzf2l\" (UniqueName: \"kubernetes.io/projected/6d3e722a-09cb-4b09-856e-b4752de9e30e-kube-api-access-qzf2l\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.435278 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.537236 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.537302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzf2l\" (UniqueName: \"kubernetes.io/projected/6d3e722a-09cb-4b09-856e-b4752de9e30e-kube-api-access-qzf2l\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.537356 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.542228 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.542854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d3e722a-09cb-4b09-856e-b4752de9e30e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.552949 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzf2l\" (UniqueName: \"kubernetes.io/projected/6d3e722a-09cb-4b09-856e-b4752de9e30e-kube-api-access-qzf2l\") pod \"nova-cell1-conductor-0\" (UID: \"6d3e722a-09cb-4b09-856e-b4752de9e30e\") " pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:54 crc kubenswrapper[4717]: I1007 14:14:54.696775 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 07 14:14:55 crc kubenswrapper[4717]: I1007 14:14:55.135967 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 07 14:14:55 crc kubenswrapper[4717]: I1007 14:14:55.301909 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d3e722a-09cb-4b09-856e-b4752de9e30e","Type":"ContainerStarted","Data":"1b0cbb91155b96ce65fa7a4dc6acbd23f4d5833f51eb2af02e966c238be1589b"} Oct 07 14:14:55 crc kubenswrapper[4717]: E1007 14:14:55.559599 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:14:55 crc kubenswrapper[4717]: E1007 14:14:55.561286 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:14:55 crc kubenswrapper[4717]: E1007 14:14:55.562825 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:14:55 crc kubenswrapper[4717]: E1007 14:14:55.562859 4717 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerName="nova-scheduler-scheduler" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.317188 4717 generic.go:334] "Generic (PLEG): container finished" podID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerID="9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" exitCode=0 Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.317279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a7467c-3d2c-4946-970b-ac3c5ddef8e5","Type":"ContainerDied","Data":"9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e"} Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.319290 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d3e722a-09cb-4b09-856e-b4752de9e30e","Type":"ContainerStarted","Data":"c94159b73d9c4375f117274093a882d53b9feed6f8848a8293b690ec818a9198"} Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.319397 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" 
Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.341890 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.341869744 podStartE2EDuration="2.341869744s" podCreationTimestamp="2025-10-07 14:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:56.336877907 +0000 UTC m=+1278.164803709" watchObservedRunningTime="2025-10-07 14:14:56.341869744 +0000 UTC m=+1278.169795536" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.640045 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.677053 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data\") pod \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.677274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j44dx\" (UniqueName: \"kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx\") pod \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.677353 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle\") pod \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\" (UID: \"46a7467c-3d2c-4946-970b-ac3c5ddef8e5\") " Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.684101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx" (OuterVolumeSpecName: "kube-api-access-j44dx") pod "46a7467c-3d2c-4946-970b-ac3c5ddef8e5" (UID: "46a7467c-3d2c-4946-970b-ac3c5ddef8e5"). InnerVolumeSpecName "kube-api-access-j44dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.711796 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46a7467c-3d2c-4946-970b-ac3c5ddef8e5" (UID: "46a7467c-3d2c-4946-970b-ac3c5ddef8e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.712783 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data" (OuterVolumeSpecName: "config-data") pod "46a7467c-3d2c-4946-970b-ac3c5ddef8e5" (UID: "46a7467c-3d2c-4946-970b-ac3c5ddef8e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.779440 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.779476 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j44dx\" (UniqueName: \"kubernetes.io/projected/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-kube-api-access-j44dx\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:56 crc kubenswrapper[4717]: I1007 14:14:56.779486 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46a7467c-3d2c-4946-970b-ac3c5ddef8e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.174840 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.288919 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.336610 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"46a7467c-3d2c-4946-970b-ac3c5ddef8e5","Type":"ContainerDied","Data":"405e9329396969aebcc739c1f2537337bc28b70a32d91edc5177e12869088d93"} Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.336666 4717 scope.go:117] "RemoveContainer" containerID="9a014f543764c3718bef450a8b1a2daadd17a7c3110bb3d6fea33c16eac7b62e" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.336803 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.347214 4717 generic.go:334] "Generic (PLEG): container finished" podID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerID="b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e" exitCode=0 Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.348108 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerDied","Data":"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e"} Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.348165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a9724b0f-14b3-435d-9b32-a2b9f862ae67","Type":"ContainerDied","Data":"989a1fcc60b24ed7efc0717d27c44cf96508de701346745f5f07634d5e1d3df2"} Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.350614 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.387532 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.397848 4717 scope.go:117] "RemoveContainer" containerID="b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.397996 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7whf\" (UniqueName: \"kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf\") pod \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.398247 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle\") pod \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.398288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data\") pod \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.398370 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs\") pod \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\" (UID: \"a9724b0f-14b3-435d-9b32-a2b9f862ae67\") " Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.399752 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs" (OuterVolumeSpecName: "logs") pod "a9724b0f-14b3-435d-9b32-a2b9f862ae67" (UID: "a9724b0f-14b3-435d-9b32-a2b9f862ae67"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.400418 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.411161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf" (OuterVolumeSpecName: "kube-api-access-h7whf") pod "a9724b0f-14b3-435d-9b32-a2b9f862ae67" (UID: "a9724b0f-14b3-435d-9b32-a2b9f862ae67"). InnerVolumeSpecName "kube-api-access-h7whf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.420061 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: E1007 14:14:57.420706 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-log" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.420724 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-log" Oct 07 14:14:57 crc kubenswrapper[4717]: E1007 14:14:57.420766 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-api" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.420772 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-api" Oct 07 14:14:57 crc kubenswrapper[4717]: E1007 14:14:57.420785 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerName="nova-scheduler-scheduler" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.420792 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerName="nova-scheduler-scheduler" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.421119 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-log" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.421139 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" containerName="nova-api-api" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.421160 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" containerName="nova-scheduler-scheduler" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.421786 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.423997 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.430880 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.437676 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data" (OuterVolumeSpecName: "config-data") pod "a9724b0f-14b3-435d-9b32-a2b9f862ae67" (UID: "a9724b0f-14b3-435d-9b32-a2b9f862ae67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.438152 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9724b0f-14b3-435d-9b32-a2b9f862ae67" (UID: "a9724b0f-14b3-435d-9b32-a2b9f862ae67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.452947 4717 scope.go:117] "RemoveContainer" containerID="5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.500919 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.500967 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.501130 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmv4\" (UniqueName: \"kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.501257 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7whf\" (UniqueName: \"kubernetes.io/projected/a9724b0f-14b3-435d-9b32-a2b9f862ae67-kube-api-access-h7whf\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.501272 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.501284 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9724b0f-14b3-435d-9b32-a2b9f862ae67-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.501294 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9724b0f-14b3-435d-9b32-a2b9f862ae67-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.522758 4717 scope.go:117] "RemoveContainer" containerID="b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e" Oct 07 14:14:57 crc kubenswrapper[4717]: E1007 14:14:57.524377 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e\": container with ID starting with b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e not found: ID does not exist" containerID="b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.524427 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e"} err="failed to get container status \"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e\": rpc error: code = NotFound desc = could not find container 
\"b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e\": container with ID starting with b421d46dd56e0aa7f1f302391116bbac05829f1b04d322d79d5e7087c4290c0e not found: ID does not exist" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.524447 4717 scope.go:117] "RemoveContainer" containerID="5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781" Oct 07 14:14:57 crc kubenswrapper[4717]: E1007 14:14:57.524816 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781\": container with ID starting with 5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781 not found: ID does not exist" containerID="5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.524870 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781"} err="failed to get container status \"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781\": rpc error: code = NotFound desc = could not find container \"5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781\": container with ID starting with 5d829c7e658005adbb363a236db8e5b1e9602388bc3c50335a5595d79433f781 not found: ID does not exist" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.603044 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.603094 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.603153 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmv4\" (UniqueName: \"kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.610602 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.611199 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.622361 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmv4\" (UniqueName: \"kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4\") pod \"nova-scheduler-0\" (UID: 
\"82d7576f-9d78-4a98-aed1-efd8715ed214\") " pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.679041 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.679354 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.722219 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.732993 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.742428 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.744172 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.746555 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.752686 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.807298 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.807414 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.807458 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.807545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.819842 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.909524 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.910258 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.910340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.910739 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.910813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.916439 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.918133 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:57 crc kubenswrapper[4717]: I1007 14:14:57.932718 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf\") pod \"nova-api-0\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " pod="openstack/nova-api-0" Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.079199 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.291445 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.359247 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82d7576f-9d78-4a98-aed1-efd8715ed214","Type":"ContainerStarted","Data":"4d0505ae852845c706e619dcb5da94c5e0f4521b094aa00f09c646c1b68323e9"} Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.575510 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.893926 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a7467c-3d2c-4946-970b-ac3c5ddef8e5" path="/var/lib/kubelet/pods/46a7467c-3d2c-4946-970b-ac3c5ddef8e5/volumes" Oct 07 14:14:58 crc kubenswrapper[4717]: I1007 14:14:58.898268 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9724b0f-14b3-435d-9b32-a2b9f862ae67" path="/var/lib/kubelet/pods/a9724b0f-14b3-435d-9b32-a2b9f862ae67/volumes" Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.383726 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82d7576f-9d78-4a98-aed1-efd8715ed214","Type":"ContainerStarted","Data":"3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50"} Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.390617 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerStarted","Data":"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd"} Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.390656 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerStarted","Data":"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97"} Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.390667 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerStarted","Data":"e37fe8e89bb8a3c6ac2713ce501cd8b47a26fe2062e56dc02145572fd15f0cdf"} Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.421625 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.421607189 podStartE2EDuration="2.421607189s" podCreationTimestamp="2025-10-07 14:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:59.411343337 +0000 UTC m=+1281.239269129" watchObservedRunningTime="2025-10-07 14:14:59.421607189 +0000 UTC m=+1281.249532991" Oct 07 14:14:59 crc kubenswrapper[4717]: I1007 14:14:59.452143 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.452121898 podStartE2EDuration="2.452121898s" podCreationTimestamp="2025-10-07 14:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:14:59.43948166 +0000 UTC m=+1281.267407452" watchObservedRunningTime="2025-10-07 14:14:59.452121898 +0000 UTC m=+1281.280047690" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.132506 4717 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g"] Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.133994 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.136260 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.142602 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.143378 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g"] Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.162604 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.162705 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5cmj\" (UniqueName: \"kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.162796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.264099 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.264182 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5cmj\" (UniqueName: \"kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.264255 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: 
I1007 14:15:00.265058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.271117 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.288286 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5cmj\" (UniqueName: \"kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj\") pod \"collect-profiles-29330775-rh52g\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.458647 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:00 crc kubenswrapper[4717]: I1007 14:15:00.896381 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g"] Oct 07 14:15:01 crc kubenswrapper[4717]: I1007 14:15:01.409434 4717 generic.go:334] "Generic (PLEG): container finished" podID="ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" containerID="f679a4ed24c1cb504b1e198617971169051b6ad035d9747c69dfdfc6ff3426f6" exitCode=0 Oct 07 14:15:01 crc kubenswrapper[4717]: I1007 14:15:01.409495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" event={"ID":"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5","Type":"ContainerDied","Data":"f679a4ed24c1cb504b1e198617971169051b6ad035d9747c69dfdfc6ff3426f6"} Oct 07 14:15:01 crc kubenswrapper[4717]: I1007 14:15:01.409732 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" event={"ID":"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5","Type":"ContainerStarted","Data":"59beca51dcd63fa71d2a7d1ca72aa16495d8e3f5c9e84b0d782d17e690918f2f"} Oct 07 14:15:01 crc kubenswrapper[4717]: I1007 14:15:01.610083 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:15:01 crc kubenswrapper[4717]: I1007 14:15:01.610183 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.679472 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.680964 4717 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.820671 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.869513 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.916704 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume\") pod \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.917198 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume\") pod \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.917283 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5cmj\" (UniqueName: \"kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj\") pod \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\" (UID: \"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5\") " Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.917838 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume" (OuterVolumeSpecName: "config-volume") pod "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" (UID: "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.924536 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj" (OuterVolumeSpecName: "kube-api-access-x5cmj") pod "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" (UID: "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5"). InnerVolumeSpecName "kube-api-access-x5cmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:02 crc kubenswrapper[4717]: I1007 14:15:02.925098 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" (UID: "ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.020558 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.020610 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.020624 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5cmj\" (UniqueName: \"kubernetes.io/projected/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5-kube-api-access-x5cmj\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.437463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" event={"ID":"ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5","Type":"ContainerDied","Data":"59beca51dcd63fa71d2a7d1ca72aa16495d8e3f5c9e84b0d782d17e690918f2f"} Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.437749 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59beca51dcd63fa71d2a7d1ca72aa16495d8e3f5c9e84b0d782d17e690918f2f" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.437583 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.693225 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:03 crc kubenswrapper[4717]: I1007 14:15:03.693240 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:04 crc kubenswrapper[4717]: I1007 14:15:04.721674 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 07 14:15:07 crc kubenswrapper[4717]: I1007 14:15:07.821814 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 14:15:07 crc kubenswrapper[4717]: I1007 14:15:07.848721 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 14:15:08 crc kubenswrapper[4717]: I1007 14:15:08.079994 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:08 crc kubenswrapper[4717]: I1007 14:15:08.080070 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:08 crc kubenswrapper[4717]: I1007 14:15:08.507587 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 14:15:09 crc kubenswrapper[4717]: I1007 14:15:09.162224 4717 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:09 crc kubenswrapper[4717]: I1007 14:15:09.162220 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.211:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:12 crc kubenswrapper[4717]: I1007 14:15:12.686729 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:15:12 crc kubenswrapper[4717]: I1007 14:15:12.687523 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:15:12 crc kubenswrapper[4717]: I1007 14:15:12.692771 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:15:12 crc kubenswrapper[4717]: I1007 14:15:12.694260 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.544856 4717 generic.go:334] "Generic (PLEG): container finished" podID="fdb35360-6956-47bd-956b-1d17e8c72024" containerID="c4b247182175d5c2daa2eac6e3b519f8495244bbffb48463bbf11859dde1a8dd" exitCode=137 Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.545051 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdb35360-6956-47bd-956b-1d17e8c72024","Type":"ContainerDied","Data":"c4b247182175d5c2daa2eac6e3b519f8495244bbffb48463bbf11859dde1a8dd"} Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.545383 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"fdb35360-6956-47bd-956b-1d17e8c72024","Type":"ContainerDied","Data":"59f2d0c945b410b02f3ec7e765938253c2e0a71882b704c0c56530b080ac1ca8"} Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.545400 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f2d0c945b410b02f3ec7e765938253c2e0a71882b704c0c56530b080ac1ca8" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.545428 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.694896 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data\") pod \"fdb35360-6956-47bd-956b-1d17e8c72024\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.694944 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle\") pod \"fdb35360-6956-47bd-956b-1d17e8c72024\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.695335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d4cd\" (UniqueName: \"kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd\") pod \"fdb35360-6956-47bd-956b-1d17e8c72024\" (UID: \"fdb35360-6956-47bd-956b-1d17e8c72024\") " Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.701335 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd" (OuterVolumeSpecName: "kube-api-access-8d4cd") pod "fdb35360-6956-47bd-956b-1d17e8c72024" (UID: "fdb35360-6956-47bd-956b-1d17e8c72024"). InnerVolumeSpecName "kube-api-access-8d4cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.724801 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fdb35360-6956-47bd-956b-1d17e8c72024" (UID: "fdb35360-6956-47bd-956b-1d17e8c72024"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.724832 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data" (OuterVolumeSpecName: "config-data") pod "fdb35360-6956-47bd-956b-1d17e8c72024" (UID: "fdb35360-6956-47bd-956b-1d17e8c72024"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.797976 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.798040 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fdb35360-6956-47bd-956b-1d17e8c72024-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:15 crc kubenswrapper[4717]: I1007 14:15:15.798053 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d4cd\" (UniqueName: \"kubernetes.io/projected/fdb35360-6956-47bd-956b-1d17e8c72024-kube-api-access-8d4cd\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.553159 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.580323 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.588583 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.602533 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:15:16 crc kubenswrapper[4717]: E1007 14:15:16.602942 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb35360-6956-47bd-956b-1d17e8c72024" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.602962 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb35360-6956-47bd-956b-1d17e8c72024" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 14:15:16 crc kubenswrapper[4717]: E1007 14:15:16.602979 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" containerName="collect-profiles" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.602985 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" containerName="collect-profiles" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.603248 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" containerName="collect-profiles" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.603270 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb35360-6956-47bd-956b-1d17e8c72024" containerName="nova-cell1-novncproxy-novncproxy" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.603997 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.605947 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.607317 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.612791 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.617346 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.718911 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.719110 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.719383 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.719469 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.719512 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlvbl\" (UniqueName: \"kubernetes.io/projected/d4241d67-38b0-4e1c-83d6-9a0531e6d902-kube-api-access-xlvbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.821342 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.821669 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 
14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.821746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.821778 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.821803 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlvbl\" (UniqueName: \"kubernetes.io/projected/d4241d67-38b0-4e1c-83d6-9a0531e6d902-kube-api-access-xlvbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.825700 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.825922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.825934 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.831743 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4241d67-38b0-4e1c-83d6-9a0531e6d902-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.839427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlvbl\" (UniqueName: \"kubernetes.io/projected/d4241d67-38b0-4e1c-83d6-9a0531e6d902-kube-api-access-xlvbl\") pod \"nova-cell1-novncproxy-0\" (UID: \"d4241d67-38b0-4e1c-83d6-9a0531e6d902\") " pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.881231 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb35360-6956-47bd-956b-1d17e8c72024" path="/var/lib/kubelet/pods/fdb35360-6956-47bd-956b-1d17e8c72024/volumes" Oct 07 14:15:16 crc kubenswrapper[4717]: I1007 14:15:16.924175 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:17 crc kubenswrapper[4717]: I1007 14:15:17.384552 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 07 14:15:17 crc kubenswrapper[4717]: W1007 14:15:17.385208 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4241d67_38b0_4e1c_83d6_9a0531e6d902.slice/crio-35688514ad1c01fc4ca46b8acb153d29055963bef20f6c80372faa70dfc0c90c WatchSource:0}: Error finding container 35688514ad1c01fc4ca46b8acb153d29055963bef20f6c80372faa70dfc0c90c: Status 404 returned error can't find the container with id 35688514ad1c01fc4ca46b8acb153d29055963bef20f6c80372faa70dfc0c90c Oct 07 14:15:17 crc kubenswrapper[4717]: I1007 14:15:17.563134 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d4241d67-38b0-4e1c-83d6-9a0531e6d902","Type":"ContainerStarted","Data":"35688514ad1c01fc4ca46b8acb153d29055963bef20f6c80372faa70dfc0c90c"} Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.083392 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.083971 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.084059 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.093321 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.573864 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d4241d67-38b0-4e1c-83d6-9a0531e6d902","Type":"ContainerStarted","Data":"b5cd8ed0e542bd8c41519003d1df1a21fc497a026ec6899ec22c2fa7b4b03516"} Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.574106 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.577549 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.606886 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.606863184 podStartE2EDuration="2.606863184s" podCreationTimestamp="2025-10-07 14:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:18.590439163 +0000 UTC m=+1300.418364955" watchObservedRunningTime="2025-10-07 14:15:18.606863184 +0000 UTC m=+1300.434788986" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.778078 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.780903 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.790857 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.866992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.867094 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxpfn\" (UniqueName: \"kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.875061 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.875128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.875161 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.875231 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.976786 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.976835 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.976862 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.976889 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.976974 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.977057 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxpfn\" (UniqueName: \"kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.977970 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.978140 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.978373 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.978948 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.979293 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:18 crc kubenswrapper[4717]: I1007 14:15:18.997659 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxpfn\" (UniqueName: 
\"kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn\") pod \"dnsmasq-dns-6559f4fbd7-hg9q8\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:19 crc kubenswrapper[4717]: I1007 14:15:19.113400 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:19 crc kubenswrapper[4717]: I1007 14:15:19.622693 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.598528 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerID="2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7" exitCode=0 Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.598630 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" event={"ID":"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb","Type":"ContainerDied","Data":"2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7"} Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.598953 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" event={"ID":"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb","Type":"ContainerStarted","Data":"4a31e9569a2607d2570b9c985eaceb5af05eedcaca2ec94386c47e65413c08c7"} Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.863534 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.865000 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-central-agent" containerID="cri-o://9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d" gracePeriod=30 Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.865118 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-notification-agent" containerID="cri-o://29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b" gracePeriod=30 Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.865079 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="sg-core" containerID="cri-o://19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f" gracePeriod=30 Oct 07 14:15:20 crc kubenswrapper[4717]: I1007 14:15:20.865295 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="proxy-httpd" containerID="cri-o://cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6" gracePeriod=30 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.464764 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.609830 4717 generic.go:334] "Generic (PLEG): container finished" podID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerID="cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6" exitCode=0 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.610085 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerID="19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f" exitCode=2 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.610151 4717 generic.go:334] "Generic (PLEG): container finished" podID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerID="9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d" exitCode=0 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.610054 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerDied","Data":"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6"} Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.610331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerDied","Data":"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f"} Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.610395 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerDied","Data":"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d"} Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.612632 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" event={"ID":"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb","Type":"ContainerStarted","Data":"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b"} Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.612825 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-log" containerID="cri-o://d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97" gracePeriod=30 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.613090 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-api" containerID="cri-o://721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd" gracePeriod=30 Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.613427 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.643029 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" podStartSLOduration=3.642992633 podStartE2EDuration="3.642992633s" podCreationTimestamp="2025-10-07 14:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:21.633641306 +0000 UTC m=+1303.461567098" watchObservedRunningTime="2025-10-07 14:15:21.642992633 +0000 UTC m=+1303.470918425" Oct 07 14:15:21 crc kubenswrapper[4717]: I1007 14:15:21.925049 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.511631 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.622251 4717 generic.go:334] "Generic (PLEG): container finished" podID="60962cd5-2d21-41df-8b97-5b9698670307" containerID="d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97" exitCode=143 Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.622293 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerDied","Data":"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97"} Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.624901 4717 generic.go:334] "Generic (PLEG): container finished" podID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerID="29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b" exitCode=0 Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.624962 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.624955 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerDied","Data":"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b"} Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.625026 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"858c9339-f223-4890-a789-9e88ebf6d5ab","Type":"ContainerDied","Data":"6762ab9c17c3b8ca52772d3632daae5a68002f845ca17fb14fa5e567cec6356f"} Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.625068 4717 scope.go:117] "RemoveContainer" containerID="cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.657774 4717 scope.go:117] "RemoveContainer" containerID="19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660501 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660520 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660555 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660595 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660662 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660692 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.660709 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqwq\" (UniqueName: \"kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq\") pod \"858c9339-f223-4890-a789-9e88ebf6d5ab\" (UID: \"858c9339-f223-4890-a789-9e88ebf6d5ab\") " Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.661747 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.661856 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.668599 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts" (OuterVolumeSpecName: "scripts") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.669424 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq" (OuterVolumeSpecName: "kube-api-access-4wqwq") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "kube-api-access-4wqwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.694200 4717 scope.go:117] "RemoveContainer" containerID="29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.703996 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.724381 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763437 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763474 4717 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763488 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763513 4717 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763526 4717 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/858c9339-f223-4890-a789-9e88ebf6d5ab-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.763537 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqwq\" (UniqueName: \"kubernetes.io/projected/858c9339-f223-4890-a789-9e88ebf6d5ab-kube-api-access-4wqwq\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.772182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.799095 4717 scope.go:117] "RemoveContainer" containerID="9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.808425 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data" (OuterVolumeSpecName: "config-data") pod "858c9339-f223-4890-a789-9e88ebf6d5ab" (UID: "858c9339-f223-4890-a789-9e88ebf6d5ab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.856763 4717 scope.go:117] "RemoveContainer" containerID="cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.857273 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6\": container with ID starting with cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6 not found: ID does not exist" containerID="cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.857319 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6"} err="failed to get container status \"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6\": rpc error: code = NotFound desc = could not find container \"cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6\": container with ID starting with cb635e0b1b945faf1268f96f077dde6cdae180203f631edc22d5d3b15ab2b6a6 not found: ID does not exist" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.857345 4717 scope.go:117] "RemoveContainer" containerID="19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.857672 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f\": container with ID starting with 19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f not found: ID does not exist" containerID="19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.857700 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f"} err="failed to get container status \"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f\": rpc error: code = NotFound desc = could not find container \"19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f\": container with ID starting with 19a9ae29ab7f93f4cd513e763f0444cf0b9ec7bcdb05947814ac894ca954342f not found: ID does not exist" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.857720 4717 scope.go:117] "RemoveContainer" containerID="29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.857964 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b\": container with ID starting with 29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b not found: ID does not exist" containerID="29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.857991 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b"} err="failed to get container status \"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b\": rpc error: code = NotFound desc = could not 
find container \"29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b\": container with ID starting with 29648fdf43cdfab756759cce66da67872b65b7ccdbc754acbdb6ed0ba5455d9b not found: ID does not exist" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.858024 4717 scope.go:117] "RemoveContainer" containerID="9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.858393 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d\": container with ID starting with 9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d not found: ID does not exist" containerID="9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.858422 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d"} err="failed to get container status \"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d\": rpc error: code = NotFound desc = could not find container \"9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d\": container with ID starting with 9626e30033b4b6dc210d66682ce0bd529aa1ab2a6113798fe981d3528c27209d not found: ID does not exist" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.865343 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.865382 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/858c9339-f223-4890-a789-9e88ebf6d5ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.964556 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.974857 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.994433 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.994954 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="proxy-httpd" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.994974 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="proxy-httpd" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.994985 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-notification-agent" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.994992 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-notification-agent" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.995028 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="sg-core" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995071 4717 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="sg-core" Oct 07 14:15:22 crc kubenswrapper[4717]: E1007 14:15:22.995147 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-central-agent" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995156 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-central-agent" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995414 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-central-agent" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995436 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="proxy-httpd" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995447 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="sg-core" Oct 07 14:15:22 crc kubenswrapper[4717]: I1007 14:15:22.995464 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" containerName="ceilometer-notification-agent" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.019878 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.020275 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.027312 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.027675 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.027954 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.169829 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-run-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.169882 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd4rf\" (UniqueName: \"kubernetes.io/projected/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-kube-api-access-zd4rf\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.169910 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.169930 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.170173 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-config-data\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.170254 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-scripts\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.170475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.170529 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-log-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272018 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272082 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-log-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272138 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-run-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd4rf\" (UniqueName: \"kubernetes.io/projected/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-kube-api-access-zd4rf\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272191 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272206 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272265 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-config-data\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272285 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-scripts\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.272636 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-run-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.273178 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-log-httpd\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.276747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-scripts\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.277026 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-config-data\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.277089 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.277611 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.281540 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.294868 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd4rf\" (UniqueName: \"kubernetes.io/projected/3c0fa622-b0b4-4100-b7fa-82e3821d1f76-kube-api-access-zd4rf\") pod 
\"ceilometer-0\" (UID: \"3c0fa622-b0b4-4100-b7fa-82e3821d1f76\") " pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.350597 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 07 14:15:23 crc kubenswrapper[4717]: I1007 14:15:23.804307 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 07 14:15:23 crc kubenswrapper[4717]: W1007 14:15:23.807690 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c0fa622_b0b4_4100_b7fa_82e3821d1f76.slice/crio-2d8650b1547b02aae73a7df09622a42ec29ab968218d4c6aed89f8ca75b92315 WatchSource:0}: Error finding container 2d8650b1547b02aae73a7df09622a42ec29ab968218d4c6aed89f8ca75b92315: Status 404 returned error can't find the container with id 2d8650b1547b02aae73a7df09622a42ec29ab968218d4c6aed89f8ca75b92315 Oct 07 14:15:24 crc kubenswrapper[4717]: I1007 14:15:24.643556 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c0fa622-b0b4-4100-b7fa-82e3821d1f76","Type":"ContainerStarted","Data":"2d8650b1547b02aae73a7df09622a42ec29ab968218d4c6aed89f8ca75b92315"} Oct 07 14:15:24 crc kubenswrapper[4717]: I1007 14:15:24.879377 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858c9339-f223-4890-a789-9e88ebf6d5ab" path="/var/lib/kubelet/pods/858c9339-f223-4890-a789-9e88ebf6d5ab/volumes" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.610997 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.652663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c0fa622-b0b4-4100-b7fa-82e3821d1f76","Type":"ContainerStarted","Data":"7aefcea4b10cfb0021c9a7d2babe7b91bd9a2c055cc4ed0e0a52c52c9b108b2d"} Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.655818 4717 generic.go:334] "Generic (PLEG): container finished" podID="60962cd5-2d21-41df-8b97-5b9698670307" containerID="721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd" exitCode=0 Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.655853 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerDied","Data":"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd"} Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.655873 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"60962cd5-2d21-41df-8b97-5b9698670307","Type":"ContainerDied","Data":"e37fe8e89bb8a3c6ac2713ce501cd8b47a26fe2062e56dc02145572fd15f0cdf"} Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.655889 4717 scope.go:117] "RemoveContainer" containerID="721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.656022 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.688117 4717 scope.go:117] "RemoveContainer" containerID="d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.721096 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf\") pod \"60962cd5-2d21-41df-8b97-5b9698670307\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.721167 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle\") pod \"60962cd5-2d21-41df-8b97-5b9698670307\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.721203 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data\") pod \"60962cd5-2d21-41df-8b97-5b9698670307\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.721314 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs\") pod \"60962cd5-2d21-41df-8b97-5b9698670307\" (UID: \"60962cd5-2d21-41df-8b97-5b9698670307\") " Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.722970 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs" (OuterVolumeSpecName: "logs") pod "60962cd5-2d21-41df-8b97-5b9698670307" (UID: "60962cd5-2d21-41df-8b97-5b9698670307"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.730499 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf" (OuterVolumeSpecName: "kube-api-access-wzlrf") pod "60962cd5-2d21-41df-8b97-5b9698670307" (UID: "60962cd5-2d21-41df-8b97-5b9698670307"). InnerVolumeSpecName "kube-api-access-wzlrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.763163 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60962cd5-2d21-41df-8b97-5b9698670307" (UID: "60962cd5-2d21-41df-8b97-5b9698670307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.799511 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data" (OuterVolumeSpecName: "config-data") pod "60962cd5-2d21-41df-8b97-5b9698670307" (UID: "60962cd5-2d21-41df-8b97-5b9698670307"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.823630 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.823666 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60962cd5-2d21-41df-8b97-5b9698670307-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.823678 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/60962cd5-2d21-41df-8b97-5b9698670307-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.823692 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzlrf\" (UniqueName: \"kubernetes.io/projected/60962cd5-2d21-41df-8b97-5b9698670307-kube-api-access-wzlrf\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.963258 4717 scope.go:117] "RemoveContainer" containerID="721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd" Oct 07 14:15:25 crc kubenswrapper[4717]: E1007 14:15:25.963686 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd\": container with ID starting with 721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd not found: ID does not exist" containerID="721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.963716 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd"} err="failed to get container status \"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd\": rpc error: code = NotFound desc = could not find container \"721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd\": container with ID starting with 721cac612044af20a6839e64a53918af3d86b3e722276de3ef2a3fdedb67e9cd not found: ID does not exist" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.963737 4717 scope.go:117] "RemoveContainer" containerID="d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97" Oct 07 14:15:25 crc kubenswrapper[4717]: E1007 14:15:25.964090 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97\": container with ID starting with d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97 not found: ID does not exist" containerID="d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.964136 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97"} err="failed to get container status \"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97\": rpc error: code = NotFound desc = could not find container \"d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97\": container with ID starting with 
d865eec0efb627094d5605ff492bff5409b21c30e9fad684f5b87c6bdeda3d97 not found: ID does not exist" Oct 07 14:15:25 crc kubenswrapper[4717]: I1007 14:15:25.997980 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.007383 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.023339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:26 crc kubenswrapper[4717]: E1007 14:15:26.023765 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-log" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.023788 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-log" Oct 07 14:15:26 crc kubenswrapper[4717]: E1007 14:15:26.023807 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-api" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.023813 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-api" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.024051 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-log" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.024077 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="60962cd5-2d21-41df-8b97-5b9698670307" containerName="nova-api-api" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.027771 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.035658 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.036044 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.036955 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.038540 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129123 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129188 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129449 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.129564 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfxcs\" (UniqueName: \"kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231201 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231261 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231302 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231753 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.231883 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.232062 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfxcs\" (UniqueName: \"kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.235100 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.235132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.236393 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.237058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.247984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfxcs\" (UniqueName: \"kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs\") pod \"nova-api-0\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " 
pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.350411 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.668299 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c0fa622-b0b4-4100-b7fa-82e3821d1f76","Type":"ContainerStarted","Data":"a93dd0075b5145e27a368c564c8b60bca4bbbd2caa78530e70878e5b39b740a2"} Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.831485 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.894832 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60962cd5-2d21-41df-8b97-5b9698670307" path="/var/lib/kubelet/pods/60962cd5-2d21-41df-8b97-5b9698670307/volumes" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.925426 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:26 crc kubenswrapper[4717]: I1007 14:15:26.947474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.684767 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerStarted","Data":"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae"} Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.685024 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerStarted","Data":"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5"} Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.685036 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerStarted","Data":"459214c524078b651bf2fd4205e1368b20f7ea046b3435c7a35a096603975484"} Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.704619 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.728496 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7284729030000001 podStartE2EDuration="1.728472903s" podCreationTimestamp="2025-10-07 14:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:27.709727087 +0000 UTC m=+1309.537652899" watchObservedRunningTime="2025-10-07 14:15:27.728472903 +0000 UTC m=+1309.556398705" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.895258 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rj7jp"] Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.896858 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.905784 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.906288 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 07 14:15:27 crc kubenswrapper[4717]: I1007 14:15:27.913648 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rj7jp"] Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.078129 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.078185 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fg7\" (UniqueName: \"kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.078213 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.078454 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.180311 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fg7\" (UniqueName: \"kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.180369 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.180442 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.180550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.184919 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.188508 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.195292 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.197299 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fg7\" (UniqueName: \"kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7\") pod \"nova-cell1-cell-mapping-rj7jp\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.225768 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.698497 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c0fa622-b0b4-4100-b7fa-82e3821d1f76","Type":"ContainerStarted","Data":"8bc2b2bb11f50603443a3032c2795e109d951c375034a45837ce776f4fb7723d"} Oct 07 14:15:28 crc kubenswrapper[4717]: I1007 14:15:28.719522 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rj7jp"] Oct 07 14:15:28 crc kubenswrapper[4717]: W1007 14:15:28.724120 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod663a1e17_3611_4da4_8d88_e3f5b3c7d0b6.slice/crio-61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620 WatchSource:0}: Error finding container 61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620: Status 404 returned error can't find the container with id 61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620 Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.115235 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.185553 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.185868 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="dnsmasq-dns" containerID="cri-o://9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8" gracePeriod=10 Oct 07 14:15:29 crc kubenswrapper[4717]: E1007 14:15:29.413646 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2acf8450_e2d5_404b_a98c_d9efba061461.slice/crio-9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8.scope\": RecentStats: unable to find data in memory cache]" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.713685 4717 generic.go:334] "Generic (PLEG): container finished" podID="2acf8450-e2d5-404b-a98c-d9efba061461" containerID="9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8" exitCode=0 Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.713745 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" event={"ID":"2acf8450-e2d5-404b-a98c-d9efba061461","Type":"ContainerDied","Data":"9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8"} Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.713769 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" event={"ID":"2acf8450-e2d5-404b-a98c-d9efba061461","Type":"ContainerDied","Data":"f583b8773e8d07b3474de1529de409d3532416d433d5bac0510f3feb21961f7d"} Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.713780 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f583b8773e8d07b3474de1529de409d3532416d433d5bac0510f3feb21961f7d" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.716376 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rj7jp" 
event={"ID":"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6","Type":"ContainerStarted","Data":"54c445c672165da09acda91840f544f06248e79459c9b182bb0ec480968101ed"} Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.716399 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rj7jp" event={"ID":"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6","Type":"ContainerStarted","Data":"61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620"} Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.736560 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rj7jp" podStartSLOduration=2.736540614 podStartE2EDuration="2.736540614s" podCreationTimestamp="2025-10-07 14:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:29.734289582 +0000 UTC m=+1311.562215374" watchObservedRunningTime="2025-10-07 14:15:29.736540614 +0000 UTC m=+1311.564466406" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.767475 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930294 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930373 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930542 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lhw2\" (UniqueName: \"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930618 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.930668 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc\") pod \"2acf8450-e2d5-404b-a98c-d9efba061461\" (UID: \"2acf8450-e2d5-404b-a98c-d9efba061461\") " Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.948350 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2" (OuterVolumeSpecName: "kube-api-access-8lhw2") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "kube-api-access-8lhw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:29 crc kubenswrapper[4717]: I1007 14:15:29.989365 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config" (OuterVolumeSpecName: "config") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.005456 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.015284 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.016267 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.024244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2acf8450-e2d5-404b-a98c-d9efba061461" (UID: "2acf8450-e2d5-404b-a98c-d9efba061461"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036178 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036682 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036703 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lhw2\" (UniqueName: \"kubernetes.io/projected/2acf8450-e2d5-404b-a98c-d9efba061461-kube-api-access-8lhw2\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036717 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036730 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.036743 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2acf8450-e2d5-404b-a98c-d9efba061461-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.729320 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.729687 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3c0fa622-b0b4-4100-b7fa-82e3821d1f76","Type":"ContainerStarted","Data":"3fdb22286577196a2d1f11c08e815fb8496d140fd700aecf1fe79f571be68ce8"} Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.730371 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.762044 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.732326604 podStartE2EDuration="8.76202459s" podCreationTimestamp="2025-10-07 14:15:22 +0000 UTC" firstStartedPulling="2025-10-07 14:15:23.809745036 +0000 UTC m=+1305.637670828" lastFinishedPulling="2025-10-07 14:15:29.839443022 +0000 UTC m=+1311.667368814" observedRunningTime="2025-10-07 14:15:30.752060316 +0000 UTC m=+1312.579986128" watchObservedRunningTime="2025-10-07 14:15:30.76202459 +0000 UTC m=+1312.589950382" Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.774051 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.783480 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d5fbbb8c5-fwxxg"] Oct 07 14:15:30 crc kubenswrapper[4717]: I1007 14:15:30.879615 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" path="/var/lib/kubelet/pods/2acf8450-e2d5-404b-a98c-d9efba061461/volumes" Oct 07 14:15:31 crc kubenswrapper[4717]: I1007 
14:15:31.610189 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:15:31 crc kubenswrapper[4717]: I1007 14:15:31.610245 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:15:34 crc kubenswrapper[4717]: I1007 14:15:34.764406 4717 generic.go:334] "Generic (PLEG): container finished" podID="663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" containerID="54c445c672165da09acda91840f544f06248e79459c9b182bb0ec480968101ed" exitCode=0 Oct 07 14:15:34 crc kubenswrapper[4717]: I1007 14:15:34.764523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rj7jp" event={"ID":"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6","Type":"ContainerDied","Data":"54c445c672165da09acda91840f544f06248e79459c9b182bb0ec480968101ed"} Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.161361 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.262037 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts\") pod \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.262090 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5fg7\" (UniqueName: \"kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7\") pod \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.262190 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data\") pod \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.262237 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle\") pod \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\" (UID: \"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6\") " Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.268794 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts" (OuterVolumeSpecName: "scripts") pod "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" (UID: "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.268806 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7" (OuterVolumeSpecName: "kube-api-access-z5fg7") pod "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" (UID: "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6"). InnerVolumeSpecName "kube-api-access-z5fg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.290837 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" (UID: "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.293130 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data" (OuterVolumeSpecName: "config-data") pod "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" (UID: "663a1e17-3611-4da4-8d88-e3f5b3c7d0b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.351083 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.351167 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.364983 4717 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-scripts\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.365140 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5fg7\" (UniqueName: \"kubernetes.io/projected/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-kube-api-access-z5fg7\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.365156 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.365165 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.788238 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rj7jp" event={"ID":"663a1e17-3611-4da4-8d88-e3f5b3c7d0b6","Type":"ContainerDied","Data":"61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620"} Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.788527 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61fc1d12e02e01e7b90807d976de48992a798590bb8cb439055228ceea0bb620" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.788310 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rj7jp" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.968077 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.968521 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerName="nova-scheduler-scheduler" containerID="cri-o://3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" gracePeriod=30 Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.980756 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.981052 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-log" containerID="cri-o://7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5" gracePeriod=30 Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.981199 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-api" containerID="cri-o://072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae" gracePeriod=30 Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.989294 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.989635 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" containerID="cri-o://db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c" gracePeriod=30 Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.989899 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" containerID="cri-o://f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1" gracePeriod=30 Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.999017 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF" Oct 07 14:15:36 crc kubenswrapper[4717]: I1007 14:15:36.999397 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.216:8774/\": EOF" Oct 07 14:15:37 crc kubenswrapper[4717]: I1007 14:15:37.800605 4717 generic.go:334] "Generic (PLEG): container finished" podID="d55baeec-55bc-4176-a082-81433bcb0c42" containerID="db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c" exitCode=143 Oct 07 14:15:37 crc kubenswrapper[4717]: I1007 14:15:37.800697 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerDied","Data":"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c"} Oct 07 14:15:37 crc kubenswrapper[4717]: I1007 14:15:37.802441 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerID="7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5" exitCode=143 Oct 07 14:15:37 crc kubenswrapper[4717]: I1007 14:15:37.802477 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerDied","Data":"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5"} Oct 07 14:15:37 crc kubenswrapper[4717]: E1007 14:15:37.823146 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:15:37 crc kubenswrapper[4717]: E1007 14:15:37.824301 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:15:37 crc kubenswrapper[4717]: E1007 14:15:37.825433 4717 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 07 14:15:37 crc kubenswrapper[4717]: E1007 14:15:37.825462 4717 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerName="nova-scheduler-scheduler" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.116772 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 10.217.0.2:55186->10.217.0.208:8775: read: connection reset by peer" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.116934 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": read tcp 10.217.0.2:55184->10.217.0.208:8775: read: connection reset by peer" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.605332 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.772181 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data\") pod \"d55baeec-55bc-4176-a082-81433bcb0c42\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.772344 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxzw\" (UniqueName: \"kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw\") pod \"d55baeec-55bc-4176-a082-81433bcb0c42\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.772373 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle\") pod \"d55baeec-55bc-4176-a082-81433bcb0c42\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.772412 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs\") pod \"d55baeec-55bc-4176-a082-81433bcb0c42\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.772528 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs\") pod \"d55baeec-55bc-4176-a082-81433bcb0c42\" (UID: \"d55baeec-55bc-4176-a082-81433bcb0c42\") " Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.773506 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs" (OuterVolumeSpecName: "logs") pod "d55baeec-55bc-4176-a082-81433bcb0c42" (UID: "d55baeec-55bc-4176-a082-81433bcb0c42"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.792205 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw" (OuterVolumeSpecName: "kube-api-access-pmxzw") pod "d55baeec-55bc-4176-a082-81433bcb0c42" (UID: "d55baeec-55bc-4176-a082-81433bcb0c42"). InnerVolumeSpecName "kube-api-access-pmxzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.816521 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data" (OuterVolumeSpecName: "config-data") pod "d55baeec-55bc-4176-a082-81433bcb0c42" (UID: "d55baeec-55bc-4176-a082-81433bcb0c42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.848198 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d55baeec-55bc-4176-a082-81433bcb0c42" (UID: "d55baeec-55bc-4176-a082-81433bcb0c42"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.862897 4717 generic.go:334] "Generic (PLEG): container finished" podID="d55baeec-55bc-4176-a082-81433bcb0c42" containerID="f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1" exitCode=0 Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.863239 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerDied","Data":"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1"} Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.863317 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55baeec-55bc-4176-a082-81433bcb0c42","Type":"ContainerDied","Data":"ace1d2be5a1596b85ba54d64bc00a691ca67dbff62db777ea3a12228179664e5"} Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.863385 4717 scope.go:117] "RemoveContainer" containerID="f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.863557 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.870681 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d55baeec-55bc-4176-a082-81433bcb0c42" (UID: "d55baeec-55bc-4176-a082-81433bcb0c42"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.875743 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.875776 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmxzw\" (UniqueName: \"kubernetes.io/projected/d55baeec-55bc-4176-a082-81433bcb0c42-kube-api-access-pmxzw\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.875786 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.875795 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d55baeec-55bc-4176-a082-81433bcb0c42-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.875806 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55baeec-55bc-4176-a082-81433bcb0c42-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.923462 4717 scope.go:117] "RemoveContainer" containerID="db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.950871 4717 scope.go:117] "RemoveContainer" containerID="f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1" Oct 07 14:15:40 crc kubenswrapper[4717]: E1007 14:15:40.951404 4717 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1\": container with ID starting with f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1 not found: ID does not exist" containerID="f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.951465 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1"} err="failed to get container status \"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1\": rpc error: code = NotFound desc = could not find container \"f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1\": container with ID starting with f0b3c59b43485b019fed1317834bc1313e3ba40171567b7aaea4c3d8c36666f1 not found: ID does not exist" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.951500 4717 scope.go:117] "RemoveContainer" containerID="db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c" Oct 07 14:15:40 crc kubenswrapper[4717]: E1007 14:15:40.951879 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c\": container with ID starting with db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c not found: ID does not exist" containerID="db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c" Oct 07 14:15:40 crc kubenswrapper[4717]: I1007 14:15:40.951922 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c"} err="failed to get container status \"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c\": rpc error: code = NotFound desc = could not find container \"db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c\": container with ID starting with db5ac451f904c4a73a5c5451c3e2d757f7a4590dea6bfd42964b816e973b859c not found: ID does not exist" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.187587 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.197403 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207223 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:41 crc kubenswrapper[4717]: E1007 14:15:41.207756 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="init" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207776 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="init" Oct 07 14:15:41 crc kubenswrapper[4717]: E1007 14:15:41.207791 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207798 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" Oct 07 14:15:41 crc kubenswrapper[4717]: E1007 14:15:41.207815 4717 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="dnsmasq-dns" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207822 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="dnsmasq-dns" Oct 07 14:15:41 crc kubenswrapper[4717]: E1007 14:15:41.207836 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" containerName="nova-manage" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207844 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" containerName="nova-manage" Oct 07 14:15:41 crc kubenswrapper[4717]: E1007 14:15:41.207868 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.207876 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.208095 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" containerName="nova-manage" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.208114 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acf8450-e2d5-404b-a98c-d9efba061461" containerName="dnsmasq-dns" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.208136 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-log" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.208160 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" containerName="nova-metadata-metadata" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.209162 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.211189 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.211427 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.217617 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.388570 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-config-data\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.388660 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.388741 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c62351c-f8eb-4014-8391-229d81411849-logs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.388770 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mw7\" (UniqueName: \"kubernetes.io/projected/0c62351c-f8eb-4014-8391-229d81411849-kube-api-access-76mw7\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.388891 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.490627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.490882 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c62351c-f8eb-4014-8391-229d81411849-logs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.490902 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76mw7\" (UniqueName: \"kubernetes.io/projected/0c62351c-f8eb-4014-8391-229d81411849-kube-api-access-76mw7\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " 
pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.490998 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.491123 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-config-data\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.491542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c62351c-f8eb-4014-8391-229d81411849-logs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.495401 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-config-data\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.495984 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.495991 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c62351c-f8eb-4014-8391-229d81411849-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.523210 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76mw7\" (UniqueName: \"kubernetes.io/projected/0c62351c-f8eb-4014-8391-229d81411849-kube-api-access-76mw7\") pod \"nova-metadata-0\" (UID: \"0c62351c-f8eb-4014-8391-229d81411849\") " pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.553717 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.879966 4717 generic.go:334] "Generic (PLEG): container finished" podID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerID="3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" exitCode=0 Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.880042 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82d7576f-9d78-4a98-aed1-efd8715ed214","Type":"ContainerDied","Data":"3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50"} Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.978885 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 07 14:15:41 crc kubenswrapper[4717]: I1007 14:15:41.978936 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.102890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmv4\" (UniqueName: \"kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4\") pod \"82d7576f-9d78-4a98-aed1-efd8715ed214\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.103297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data\") pod \"82d7576f-9d78-4a98-aed1-efd8715ed214\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.103335 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle\") pod \"82d7576f-9d78-4a98-aed1-efd8715ed214\" (UID: \"82d7576f-9d78-4a98-aed1-efd8715ed214\") " Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.107967 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4" (OuterVolumeSpecName: "kube-api-access-xxmv4") pod "82d7576f-9d78-4a98-aed1-efd8715ed214" (UID: "82d7576f-9d78-4a98-aed1-efd8715ed214"). InnerVolumeSpecName "kube-api-access-xxmv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.139433 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82d7576f-9d78-4a98-aed1-efd8715ed214" (UID: "82d7576f-9d78-4a98-aed1-efd8715ed214"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.142894 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data" (OuterVolumeSpecName: "config-data") pod "82d7576f-9d78-4a98-aed1-efd8715ed214" (UID: "82d7576f-9d78-4a98-aed1-efd8715ed214"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.205467 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.205511 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmv4\" (UniqueName: \"kubernetes.io/projected/82d7576f-9d78-4a98-aed1-efd8715ed214-kube-api-access-xxmv4\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.205525 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82d7576f-9d78-4a98-aed1-efd8715ed214-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.881992 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55baeec-55bc-4176-a082-81433bcb0c42" path="/var/lib/kubelet/pods/d55baeec-55bc-4176-a082-81433bcb0c42/volumes" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.884022 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.908853 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.908845 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"82d7576f-9d78-4a98-aed1-efd8715ed214","Type":"ContainerDied","Data":"4d0505ae852845c706e619dcb5da94c5e0f4521b094aa00f09c646c1b68323e9"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.910316 4717 scope.go:117] "RemoveContainer" containerID="3dcdbf3bf174ebf44a382f766610e5d4b392608179a4bc204330193c0bc02b50" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.934365 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c62351c-f8eb-4014-8391-229d81411849","Type":"ContainerStarted","Data":"71aad3a0efb15940070612a974ecfdea4118229b3ca14887da4385247d82dd8e"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.934847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c62351c-f8eb-4014-8391-229d81411849","Type":"ContainerStarted","Data":"8ca4dc83034d99888df0c7d76f3a5c91d6740692e616a51a81829d1f2550e2e5"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.934858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0c62351c-f8eb-4014-8391-229d81411849","Type":"ContainerStarted","Data":"4d7ebce9655ab09aca76760ba40b874757c0c6b959396058f8359d555372c4c3"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.943516 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerDied","Data":"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.943542 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.943454 4717 generic.go:334] "Generic (PLEG): container finished" podID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerID="072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae" exitCode=0 Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.943568 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"033365db-a07d-4e41-a0bc-d1781bd8412d","Type":"ContainerDied","Data":"459214c524078b651bf2fd4205e1368b20f7ea046b3435c7a35a096603975484"} Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.962506 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.96249195 podStartE2EDuration="1.96249195s" podCreationTimestamp="2025-10-07 14:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:42.961439261 +0000 UTC m=+1324.789365053" watchObservedRunningTime="2025-10-07 14:15:42.96249195 +0000 UTC m=+1324.790417742" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.964525 4717 scope.go:117] "RemoveContainer" containerID="072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae" Oct 07 14:15:42 crc kubenswrapper[4717]: I1007 14:15:42.984974 4717 scope.go:117] "RemoveContainer" containerID="7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.004051 4717 scope.go:117] "RemoveContainer" containerID="072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae" Oct 07 14:15:43 crc kubenswrapper[4717]: E1007 14:15:43.004410 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae\": container with ID starting with 072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae not found: ID does not exist" containerID="072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.004459 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae"} err="failed to get container status \"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae\": rpc error: code = NotFound desc = could not find container \"072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae\": container with ID starting with 072d45ef8ecc6412378fc2431acb94d3e98c22139f29cec83cd1ed1dac9b9aae not found: ID does not exist" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.004480 4717 scope.go:117] "RemoveContainer" containerID="7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5" Oct 07 14:15:43 crc kubenswrapper[4717]: E1007 14:15:43.004780 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5\": container with ID starting with 7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5 not found: ID does not exist" containerID="7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.004824 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5"} err="failed to get container status \"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5\": rpc error: code = NotFound desc = could not find container \"7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5\": container with ID starting with 7329b0055356f6d9b53c56e6acafd797fe5f12e3d2558419810e364a25693dd5 not found: ID does not exist" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028659 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028714 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028779 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028807 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028830 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.028862 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfxcs\" (UniqueName: \"kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs\") pod \"033365db-a07d-4e41-a0bc-d1781bd8412d\" (UID: \"033365db-a07d-4e41-a0bc-d1781bd8412d\") " Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.029613 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs" (OuterVolumeSpecName: "logs") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.034039 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs" (OuterVolumeSpecName: "kube-api-access-bfxcs") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "kube-api-access-bfxcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.060179 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.061938 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data" (OuterVolumeSpecName: "config-data") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.083387 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.085439 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "033365db-a07d-4e41-a0bc-d1781bd8412d" (UID: "033365db-a07d-4e41-a0bc-d1781bd8412d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130785 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130845 4717 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130857 4717 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130867 4717 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/033365db-a07d-4e41-a0bc-d1781bd8412d-logs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130897 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033365db-a07d-4e41-a0bc-d1781bd8412d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.130905 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfxcs\" (UniqueName: \"kubernetes.io/projected/033365db-a07d-4e41-a0bc-d1781bd8412d-kube-api-access-bfxcs\") on node \"crc\" DevicePath \"\"" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.289450 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.308533 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.319338 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:43 crc kubenswrapper[4717]: E1007 14:15:43.335460 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerName="nova-scheduler-scheduler" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.335728 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerName="nova-scheduler-scheduler" Oct 07 14:15:43 crc kubenswrapper[4717]: E1007 14:15:43.335825 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-log" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.335902 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-log" Oct 07 14:15:43 crc kubenswrapper[4717]: E1007 14:15:43.335994 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-api" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.336093 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-api" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.336494 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-api" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.336602 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" containerName="nova-api-log" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.336687 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" containerName="nova-scheduler-scheduler" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.338140 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.338820 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.340424 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.340582 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.341002 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436523 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-config-data\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436584 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-public-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436614 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-internal-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436704 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-742bg\" (UniqueName: \"kubernetes.io/projected/975022f5-b6f2-4d3f-adbb-f4b878cd2758-kube-api-access-742bg\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.436848 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975022f5-b6f2-4d3f-adbb-f4b878cd2758-logs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.538425 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.538732 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975022f5-b6f2-4d3f-adbb-f4b878cd2758-logs\") pod \"nova-api-0\" (UID: 
\"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.538865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-config-data\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.538983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-public-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.539183 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-internal-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.539278 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-742bg\" (UniqueName: \"kubernetes.io/projected/975022f5-b6f2-4d3f-adbb-f4b878cd2758-kube-api-access-742bg\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.540057 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975022f5-b6f2-4d3f-adbb-f4b878cd2758-logs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.542753 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.544545 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-config-data\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.545540 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-public-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.548118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975022f5-b6f2-4d3f-adbb-f4b878cd2758-internal-tls-certs\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.562194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-742bg\" (UniqueName: \"kubernetes.io/projected/975022f5-b6f2-4d3f-adbb-f4b878cd2758-kube-api-access-742bg\") pod \"nova-api-0\" (UID: \"975022f5-b6f2-4d3f-adbb-f4b878cd2758\") " 
pod="openstack/nova-api-0" Oct 07 14:15:43 crc kubenswrapper[4717]: I1007 14:15:43.662840 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.118539 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 07 14:15:44 crc kubenswrapper[4717]: W1007 14:15:44.120967 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975022f5_b6f2_4d3f_adbb_f4b878cd2758.slice/crio-8a8c53581b6e75fe1645f3ce33fa472f2b6304c0d45eae454cde41bfe7288b41 WatchSource:0}: Error finding container 8a8c53581b6e75fe1645f3ce33fa472f2b6304c0d45eae454cde41bfe7288b41: Status 404 returned error can't find the container with id 8a8c53581b6e75fe1645f3ce33fa472f2b6304c0d45eae454cde41bfe7288b41 Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.888298 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033365db-a07d-4e41-a0bc-d1781bd8412d" path="/var/lib/kubelet/pods/033365db-a07d-4e41-a0bc-d1781bd8412d/volumes" Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.965912 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"975022f5-b6f2-4d3f-adbb-f4b878cd2758","Type":"ContainerStarted","Data":"e5153a197972a5a4427b205041353c479fc470f42bb16a071e4e0727669f675c"} Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.965956 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"975022f5-b6f2-4d3f-adbb-f4b878cd2758","Type":"ContainerStarted","Data":"9a85b8e2a38064e1ff25e7519e5af3d0a386b95e0e1deff0e499aa5897bee3ea"} Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.965978 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"975022f5-b6f2-4d3f-adbb-f4b878cd2758","Type":"ContainerStarted","Data":"8a8c53581b6e75fe1645f3ce33fa472f2b6304c0d45eae454cde41bfe7288b41"} Oct 07 14:15:44 crc kubenswrapper[4717]: I1007 14:15:44.988240 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.988215207 podStartE2EDuration="1.988215207s" podCreationTimestamp="2025-10-07 14:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:15:44.982942482 +0000 UTC m=+1326.810868274" watchObservedRunningTime="2025-10-07 14:15:44.988215207 +0000 UTC m=+1326.816141009" Oct 07 14:15:46 crc kubenswrapper[4717]: I1007 14:15:46.554062 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:15:46 crc kubenswrapper[4717]: I1007 14:15:46.554425 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 07 14:15:51 crc kubenswrapper[4717]: I1007 14:15:51.554817 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:15:51 crc kubenswrapper[4717]: I1007 14:15:51.555383 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 07 14:15:52 crc kubenswrapper[4717]: I1007 14:15:52.569433 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c62351c-f8eb-4014-8391-229d81411849" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: 
request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:52 crc kubenswrapper[4717]: I1007 14:15:52.569396 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0c62351c-f8eb-4014-8391-229d81411849" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.218:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:53 crc kubenswrapper[4717]: I1007 14:15:53.359951 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 07 14:15:53 crc kubenswrapper[4717]: I1007 14:15:53.663617 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:53 crc kubenswrapper[4717]: I1007 14:15:53.663681 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 07 14:15:54 crc kubenswrapper[4717]: I1007 14:15:54.677269 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="975022f5-b6f2-4d3f-adbb-f4b878cd2758" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:15:54 crc kubenswrapper[4717]: I1007 14:15:54.677317 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="975022f5-b6f2-4d3f-adbb-f4b878cd2758" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.219:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.560588 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.561150 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.566590 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.567105 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.610208 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.610271 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.610320 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.611149 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:16:01 crc kubenswrapper[4717]: I1007 14:16:01.611219 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0" gracePeriod=600 Oct 07 14:16:02 crc kubenswrapper[4717]: I1007 14:16:02.122680 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0" exitCode=0 Oct 07 14:16:02 crc kubenswrapper[4717]: I1007 14:16:02.122753 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0"} Oct 07 14:16:02 crc kubenswrapper[4717]: I1007 14:16:02.123235 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86"} Oct 07 14:16:02 crc kubenswrapper[4717]: I1007 14:16:02.123266 4717 scope.go:117] "RemoveContainer" containerID="3d269e921064d5e2c67af781cbbaee93e5f7c52bd89888c96165e3264b80d4ba" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.670807 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.671347 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.671971 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.672003 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.677488 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:16:03 crc kubenswrapper[4717]: I1007 14:16:03.678253 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 07 14:16:12 crc kubenswrapper[4717]: I1007 14:16:12.929246 4717 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod82d7576f-9d78-4a98-aed1-efd8715ed214"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod82d7576f-9d78-4a98-aed1-efd8715ed214] : Timed out while waiting for systemd to remove kubepods-besteffort-pod82d7576f_9d78_4a98_aed1_efd8715ed214.slice" Oct 07 14:16:12 crc kubenswrapper[4717]: E1007 14:16:12.929779 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod82d7576f-9d78-4a98-aed1-efd8715ed214] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod82d7576f-9d78-4a98-aed1-efd8715ed214] : Timed out while waiting for systemd to remove 
kubepods-besteffort-pod82d7576f_9d78_4a98_aed1_efd8715ed214.slice" pod="openstack/nova-scheduler-0" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.228140 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.253768 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.263086 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.276245 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.277544 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.280343 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.292044 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.327503 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztds4\" (UniqueName: \"kubernetes.io/projected/beebd3d0-0aca-4953-9f04-ea98632ae71b-kube-api-access-ztds4\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.327605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-config-data\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.327648 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.429458 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztds4\" (UniqueName: \"kubernetes.io/projected/beebd3d0-0aca-4953-9f04-ea98632ae71b-kube-api-access-ztds4\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.429547 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-config-data\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.429600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 
crc kubenswrapper[4717]: I1007 14:16:13.435586 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-config-data\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.436094 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/beebd3d0-0aca-4953-9f04-ea98632ae71b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.447336 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztds4\" (UniqueName: \"kubernetes.io/projected/beebd3d0-0aca-4953-9f04-ea98632ae71b-kube-api-access-ztds4\") pod \"nova-scheduler-0\" (UID: \"beebd3d0-0aca-4953-9f04-ea98632ae71b\") " pod="openstack/nova-scheduler-0" Oct 07 14:16:13 crc kubenswrapper[4717]: I1007 14:16:13.609121 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 07 14:16:14 crc kubenswrapper[4717]: I1007 14:16:14.027462 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 07 14:16:14 crc kubenswrapper[4717]: I1007 14:16:14.247401 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"beebd3d0-0aca-4953-9f04-ea98632ae71b","Type":"ContainerStarted","Data":"4c5d5ae713f7fb7cb4778680356ceb06f500b1afbe96153e0b93db88b3a5e8ff"} Oct 07 14:16:14 crc kubenswrapper[4717]: I1007 14:16:14.248109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"beebd3d0-0aca-4953-9f04-ea98632ae71b","Type":"ContainerStarted","Data":"40cb9558806deb01a2700de6ed46b1aae2c3cfcff6cbc1aafa7923586cab7506"} Oct 07 14:16:14 crc kubenswrapper[4717]: I1007 14:16:14.879420 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d7576f-9d78-4a98-aed1-efd8715ed214" path="/var/lib/kubelet/pods/82d7576f-9d78-4a98-aed1-efd8715ed214/volumes" Oct 07 14:16:18 crc kubenswrapper[4717]: I1007 14:16:18.609771 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 07 14:16:23 crc kubenswrapper[4717]: I1007 14:16:23.610147 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 07 14:16:23 crc kubenswrapper[4717]: I1007 14:16:23.635589 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 07 14:16:23 crc kubenswrapper[4717]: I1007 14:16:23.658528 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=10.658509166 podStartE2EDuration="10.658509166s" podCreationTimestamp="2025-10-07 14:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:16:14.266209725 +0000 UTC m=+1356.094135527" watchObservedRunningTime="2025-10-07 14:16:23.658509166 +0000 UTC m=+1365.486434958" Oct 07 14:16:24 crc kubenswrapper[4717]: I1007 14:16:24.392036 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 07 14:16:33 crc kubenswrapper[4717]: I1007 14:16:33.635363 4717 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:34 crc kubenswrapper[4717]: I1007 14:16:34.572863 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:37 crc kubenswrapper[4717]: I1007 14:16:37.713113 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="rabbitmq" containerID="cri-o://1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e" gracePeriod=604796 Oct 07 14:16:38 crc kubenswrapper[4717]: I1007 14:16:38.526027 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="rabbitmq" containerID="cri-o://719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89" gracePeriod=604797 Oct 07 14:16:43 crc kubenswrapper[4717]: I1007 14:16:43.778213 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.141698 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.304271 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452447 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452516 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsxqm\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452592 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452609 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452681 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 
14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452737 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452772 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452827 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452884 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.452923 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data\") pod \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\" (UID: \"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2\") " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.454253 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.458508 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.460753 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm" (OuterVolumeSpecName: "kube-api-access-hsxqm") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "kube-api-access-hsxqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.460876 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.475355 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info" (OuterVolumeSpecName: "pod-info") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.478160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.482196 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.524480 4717 generic.go:334] "Generic (PLEG): container finished" podID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerID="1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e" exitCode=0 Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.524529 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerDied","Data":"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e"} Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.524693 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"82faaf9f-edd1-4ea3-85f9-8b359fbd99a2","Type":"ContainerDied","Data":"fef88196478f534017c6ef0d5bfc8001904b391847f0c33e2ceff9f20fc47ae8"} Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.524714 4717 scope.go:117] "RemoveContainer" containerID="1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.524861 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.550851 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558325 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558367 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558378 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558390 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558398 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558408 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558416 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsxqm\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-kube-api-access-hsxqm\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558424 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.558780 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data" (OuterVolumeSpecName: "config-data") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.595815 4717 scope.go:117] "RemoveContainer" containerID="08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.614864 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.628614 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf" (OuterVolumeSpecName: "server-conf") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.660712 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.660927 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.660996 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.665568 4717 scope.go:117] "RemoveContainer" containerID="1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e" Oct 07 14:16:44 crc kubenswrapper[4717]: E1007 14:16:44.666045 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e\": container with ID starting with 1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e not found: ID does not exist" containerID="1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.666139 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e"} err="failed to get container status \"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e\": rpc error: code = NotFound desc = could not find container \"1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e\": container with ID starting with 1a6a03e87208757ddb80863b4685daa9a4957054f7cd4881513cc85aa36d8a4e not found: ID does not exist" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.666215 4717 scope.go:117] "RemoveContainer" containerID="08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b" Oct 07 14:16:44 crc kubenswrapper[4717]: E1007 14:16:44.667161 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b\": container with ID starting with 08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b not found: ID does not exist" containerID="08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.667184 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b"} err="failed to get container status \"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b\": rpc error: code = NotFound desc = could not find container \"08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b\": container with ID starting with 08f66ed4778fb434af0b7625e63230d02447bd517e237968691d4cffeed5884b not found: ID does not exist" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.668173 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd" (OuterVolumeSpecName: 
"rabbitmq-confd") pod "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" (UID: "82faaf9f-edd1-4ea3-85f9-8b359fbd99a2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.762494 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.863334 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.887145 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.897461 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:44 crc kubenswrapper[4717]: E1007 14:16:44.897873 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="rabbitmq" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.897892 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="rabbitmq" Oct 07 14:16:44 crc kubenswrapper[4717]: E1007 14:16:44.897926 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="setup-container" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.897932 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="setup-container" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.898180 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" containerName="rabbitmq" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.899686 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.902872 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.903172 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.903338 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.903612 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tlzgm" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.903764 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.903893 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.904400 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 07 14:16:44 crc kubenswrapper[4717]: I1007 14:16:44.915588 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.064768 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.068962 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069022 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069084 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069106 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069121 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069158 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069177 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6ac461e-73e2-4268-8a00-6faee58bae2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069269 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc 
kubenswrapper[4717]: I1007 14:16:45.069349 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6ac461e-73e2-4268-8a00-6faee58bae2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.069391 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bvbc\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-kube-api-access-8bvbc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170666 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170817 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170842 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170865 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.170957 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171063 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171126 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: 
\"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171162 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171204 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd\") pod \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\" (UID: \"29b16c62-141d-4bf5-ba4c-79590bdd39cd\") " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171480 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171565 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6ac461e-73e2-4268-8a00-6faee58bae2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bvbc\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-kube-api-access-8bvbc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171630 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171654 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171697 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171719 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171734 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171777 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171813 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171850 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171866 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6ac461e-73e2-4268-8a00-6faee58bae2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.171937 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.172132 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.172564 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.176704 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.177782 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6ac461e-73e2-4268-8a00-6faee58bae2b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.178434 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.178630 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.178907 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.179146 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d6ac461e-73e2-4268-8a00-6faee58bae2b-config-data\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.180575 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.180643 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t" (OuterVolumeSpecName: "kube-api-access-czm9t") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "kube-api-access-czm9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.180731 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.180803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info" (OuterVolumeSpecName: "pod-info") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.181729 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.184297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.186550 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6ac461e-73e2-4268-8a00-6faee58bae2b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.188351 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bvbc\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-kube-api-access-8bvbc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.190839 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6ac461e-73e2-4268-8a00-6faee58bae2b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.222457 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data" (OuterVolumeSpecName: "config-data") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.249627 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"d6ac461e-73e2-4268-8a00-6faee58bae2b\") " pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274667 4717 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274719 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czm9t\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-kube-api-access-czm9t\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274735 4717 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/29b16c62-141d-4bf5-ba4c-79590bdd39cd-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274747 4717 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/29b16c62-141d-4bf5-ba4c-79590bdd39cd-pod-info\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274759 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274770 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274795 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.274808 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.276277 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf" (OuterVolumeSpecName: "server-conf") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.300054 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.335286 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "29b16c62-141d-4bf5-ba4c-79590bdd39cd" (UID: "29b16c62-141d-4bf5-ba4c-79590bdd39cd"). 
InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.376392 4717 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/29b16c62-141d-4bf5-ba4c-79590bdd39cd-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.376434 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.376445 4717 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/29b16c62-141d-4bf5-ba4c-79590bdd39cd-server-conf\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.394714 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 14:16:45 crc kubenswrapper[4717]: E1007 14:16:45.396367 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="rabbitmq" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.396394 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="rabbitmq" Oct 07 14:16:45 crc kubenswrapper[4717]: E1007 14:16:45.396410 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="setup-container" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.396417 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="setup-container" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.396646 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerName="rabbitmq" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.399933 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.408488 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.523093 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.539965 4717 generic.go:334] "Generic (PLEG): container finished" podID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" containerID="719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89" exitCode=0 Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.540038 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerDied","Data":"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89"} Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.540066 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"29b16c62-141d-4bf5-ba4c-79590bdd39cd","Type":"ContainerDied","Data":"7846d39f651bd4779ca7d799ac96ddf741a592cf173d69a34928ba0c596ed704"} Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.540083 4717 scope.go:117] "RemoveContainer" containerID="719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.540214 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.579577 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.579670 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckw9h\" (UniqueName: \"kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.579715 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.582552 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.585336 4717 scope.go:117] "RemoveContainer" containerID="610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.609362 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.625772 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.627635 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635390 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635424 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635624 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635646 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635768 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.635841 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.641628 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-w2pxm" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.644300 4717 scope.go:117] "RemoveContainer" containerID="719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.645538 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:45 crc kubenswrapper[4717]: E1007 14:16:45.646744 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89\": container with ID starting with 719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89 not found: ID does not exist" containerID="719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.646797 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89"} err="failed to get container status \"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89\": rpc error: code = NotFound desc = could not find container \"719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89\": container with ID starting with 719b8fc288f15f2b5f6023a35f9e61a30122597e3964fc043da4a75d3bca1f89 not found: ID does not exist" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.646830 4717 scope.go:117] "RemoveContainer" containerID="610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3" Oct 07 14:16:45 crc kubenswrapper[4717]: E1007 14:16:45.647498 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3\": container with ID starting with 610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3 not found: ID does not exist" containerID="610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.647541 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3"} 
err="failed to get container status \"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3\": rpc error: code = NotFound desc = could not find container \"610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3\": container with ID starting with 610aaaa6a273e264f637d551251afedf2604cf6e085bb75dfd17980051a039b3 not found: ID does not exist" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.681451 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.681729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckw9h\" (UniqueName: \"kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.681798 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.682534 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.682841 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.701142 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckw9h\" (UniqueName: \"kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h\") pod \"redhat-operators-z7cwh\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.717285 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.784490 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shgqc\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-kube-api-access-shgqc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.784874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4472de66-ea08-4251-856b-4cb130e7cf1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.784936 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785032 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785090 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785151 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785227 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785302 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785357 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785380 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4472de66-ea08-4251-856b-4cb130e7cf1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.785421 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887574 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887623 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887724 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887782 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887884 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc 
kubenswrapper[4717]: I1007 14:16:45.887909 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4472de66-ea08-4251-856b-4cb130e7cf1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.887988 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shgqc\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-kube-api-access-shgqc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.888074 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4472de66-ea08-4251-856b-4cb130e7cf1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.889470 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.889700 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.889916 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.890075 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.890208 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.891607 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/4472de66-ea08-4251-856b-4cb130e7cf1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.892862 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.892962 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4472de66-ea08-4251-856b-4cb130e7cf1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.894891 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4472de66-ea08-4251-856b-4cb130e7cf1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.895637 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.914786 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shgqc\" (UniqueName: \"kubernetes.io/projected/4472de66-ea08-4251-856b-4cb130e7cf1b-kube-api-access-shgqc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.937128 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4472de66-ea08-4251-856b-4cb130e7cf1b\") " pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:45 crc kubenswrapper[4717]: I1007 14:16:45.957344 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.076191 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.211251 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.481420 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 07 14:16:46 crc kubenswrapper[4717]: W1007 14:16:46.482062 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4472de66_ea08_4251_856b_4cb130e7cf1b.slice/crio-c6aa95dce0721828256c99ceffe946cc377c974516f981c917577ed9d6d4ca0a WatchSource:0}: Error finding container c6aa95dce0721828256c99ceffe946cc377c974516f981c917577ed9d6d4ca0a: Status 404 returned error can't find the container with id c6aa95dce0721828256c99ceffe946cc377c974516f981c917577ed9d6d4ca0a Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.559281 4717 generic.go:334] "Generic (PLEG): container finished" podID="4df13739-727c-43f8-9c6b-956f98395ab8" containerID="c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6" exitCode=0 Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.559336 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerDied","Data":"c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6"} Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.559382 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerStarted","Data":"7b2f398778a27703aea5317608bf1a751e50c473e893cbfba09eefef8b7e0241"} Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.561229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6ac461e-73e2-4268-8a00-6faee58bae2b","Type":"ContainerStarted","Data":"c55feb323d41fe1ce8ae22614a6783ade6bec3fbfeeab6d259d8bd0f95c624df"} Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.564363 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4472de66-ea08-4251-856b-4cb130e7cf1b","Type":"ContainerStarted","Data":"c6aa95dce0721828256c99ceffe946cc377c974516f981c917577ed9d6d4ca0a"} Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.880084 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b16c62-141d-4bf5-ba4c-79590bdd39cd" path="/var/lib/kubelet/pods/29b16c62-141d-4bf5-ba4c-79590bdd39cd/volumes" Oct 07 14:16:46 crc kubenswrapper[4717]: I1007 14:16:46.881619 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82faaf9f-edd1-4ea3-85f9-8b359fbd99a2" path="/var/lib/kubelet/pods/82faaf9f-edd1-4ea3-85f9-8b359fbd99a2/volumes" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.588561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6ac461e-73e2-4268-8a00-6faee58bae2b","Type":"ContainerStarted","Data":"2eed2682da9f51e17fd904c32f636821d1abbb35590ca45054536351ada7acef"} Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.590406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4472de66-ea08-4251-856b-4cb130e7cf1b","Type":"ContainerStarted","Data":"8cb58ca6de240b7b213adc59c9f0428b6b59eae548eaf19065f3181b590fc423"} Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.595698 4717 generic.go:334] "Generic (PLEG): container finished" podID="4df13739-727c-43f8-9c6b-956f98395ab8" containerID="df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c" exitCode=0 Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.595755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerDied","Data":"df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c"} Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.680843 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.682423 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.686856 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.705069 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.849929 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850128 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkj9w\" (UniqueName: \"kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850343 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850489 4717 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.850575 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.952980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953161 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkj9w\" (UniqueName: \"kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953224 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953246 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953314 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953397 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.953435 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.954546 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.954554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.955509 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.955630 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.955717 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.955820 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:48 crc kubenswrapper[4717]: I1007 14:16:48.978243 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkj9w\" (UniqueName: \"kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w\") pod \"dnsmasq-dns-759799d765-j8g55\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:49 crc kubenswrapper[4717]: I1007 14:16:49.002993 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:49 crc kubenswrapper[4717]: I1007 14:16:49.513893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:16:49 crc kubenswrapper[4717]: I1007 14:16:49.614253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-j8g55" event={"ID":"b5b23020-1c9f-42c1-9b64-4e55b117add0","Type":"ContainerStarted","Data":"d21961faabd8aaaeacee4f1243f6e72ee11bb4fda5d0ff08847713504850134d"} Oct 07 14:16:50 crc kubenswrapper[4717]: I1007 14:16:50.623601 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerStarted","Data":"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804"} Oct 07 14:16:50 crc kubenswrapper[4717]: I1007 14:16:50.626581 4717 generic.go:334] "Generic (PLEG): container finished" podID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerID="90553621d7bf2b41a9a1fa9a4e3f8cb602904f7f491b01fda39d9d4f080216dd" exitCode=0 Oct 07 14:16:50 crc kubenswrapper[4717]: I1007 14:16:50.626737 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-j8g55" event={"ID":"b5b23020-1c9f-42c1-9b64-4e55b117add0","Type":"ContainerDied","Data":"90553621d7bf2b41a9a1fa9a4e3f8cb602904f7f491b01fda39d9d4f080216dd"} Oct 07 14:16:50 crc kubenswrapper[4717]: I1007 14:16:50.653644 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7cwh" podStartSLOduration=2.894689349 podStartE2EDuration="5.653614384s" podCreationTimestamp="2025-10-07 14:16:45 +0000 UTC" firstStartedPulling="2025-10-07 14:16:46.561333949 +0000 UTC m=+1388.389259741" lastFinishedPulling="2025-10-07 14:16:49.320258984 +0000 UTC m=+1391.148184776" observedRunningTime="2025-10-07 14:16:50.642841448 +0000 UTC m=+1392.470767240" watchObservedRunningTime="2025-10-07 14:16:50.653614384 +0000 UTC m=+1392.481540186" Oct 07 14:16:51 crc kubenswrapper[4717]: I1007 14:16:51.639849 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-j8g55" event={"ID":"b5b23020-1c9f-42c1-9b64-4e55b117add0","Type":"ContainerStarted","Data":"d8209c6b679b79c26393eb5cf8715ffe411cba5f0434eae43fee01f5d29ca943"} Oct 07 14:16:51 crc kubenswrapper[4717]: I1007 14:16:51.640076 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:51 crc kubenswrapper[4717]: I1007 14:16:51.672938 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-759799d765-j8g55" podStartSLOduration=3.672918266 podStartE2EDuration="3.672918266s" podCreationTimestamp="2025-10-07 14:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:16:51.665314817 +0000 UTC m=+1393.493240609" watchObservedRunningTime="2025-10-07 14:16:51.672918266 +0000 UTC m=+1393.500844058" Oct 07 14:16:55 crc kubenswrapper[4717]: I1007 14:16:55.718510 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:55 crc kubenswrapper[4717]: I1007 14:16:55.719156 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:16:56 crc kubenswrapper[4717]: I1007 
14:16:56.775582 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7cwh" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="registry-server" probeResult="failure" output=< Oct 07 14:16:56 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:16:56 crc kubenswrapper[4717]: > Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.004862 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.085127 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.085371 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="dnsmasq-dns" containerID="cri-o://165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b" gracePeriod=10 Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.213341 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-wm5sb"] Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.215095 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.230207 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-wm5sb"] Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.398520 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.398874 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.399023 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwjll\" (UniqueName: \"kubernetes.io/projected/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-kube-api-access-kwjll\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.399307 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.399455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: 
\"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.399605 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.399674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-config\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502430 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502467 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwjll\" (UniqueName: \"kubernetes.io/projected/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-kube-api-access-kwjll\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502631 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502720 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502814 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.502857 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-config\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: 
\"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.503501 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-openstack-edpm-ipam\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.504343 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-svc\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.505039 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-sb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.505498 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-config\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.506254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-ovsdbserver-nb\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.506493 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-dns-swift-storage-0\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.537227 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwjll\" (UniqueName: \"kubernetes.io/projected/ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6-kube-api-access-kwjll\") pod \"dnsmasq-dns-5bb847fbb7-wm5sb\" (UID: \"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6\") " pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.564088 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.702746 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.787500 4717 generic.go:334] "Generic (PLEG): container finished" podID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerID="165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b" exitCode=0 Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.787543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" event={"ID":"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb","Type":"ContainerDied","Data":"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b"} Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.787573 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" event={"ID":"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb","Type":"ContainerDied","Data":"4a31e9569a2607d2570b9c985eaceb5af05eedcaca2ec94386c47e65413c08c7"} Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.787589 4717 scope.go:117] "RemoveContainer" containerID="165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.787763 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6559f4fbd7-hg9q8" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.810620 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.810729 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.810912 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxpfn\" (UniqueName: \"kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.810946 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.811002 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.811093 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb\") pod \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\" (UID: \"d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb\") " Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.824757 4717 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn" (OuterVolumeSpecName: "kube-api-access-jxpfn") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "kube-api-access-jxpfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.913001 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxpfn\" (UniqueName: \"kubernetes.io/projected/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-kube-api-access-jxpfn\") on node \"crc\" DevicePath \"\"" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.969178 4717 scope.go:117] "RemoveContainer" containerID="2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7" Oct 07 14:16:59 crc kubenswrapper[4717]: I1007 14:16:59.984635 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.012610 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config" (OuterVolumeSpecName: "config") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.019641 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.019678 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.028741 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.039884 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.042423 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" (UID: "d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.047960 4717 scope.go:117] "RemoveContainer" containerID="165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b" Oct 07 14:17:00 crc kubenswrapper[4717]: E1007 14:17:00.048464 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b\": container with ID starting with 165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b not found: ID does not exist" containerID="165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.048503 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b"} err="failed to get container status \"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b\": rpc error: code = NotFound desc = could not find container \"165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b\": container with ID starting with 165b438d9eccd278b0ff3947517e3595093d4f7f3a82c55c35b22cd9cbad1e4b not found: ID does not exist" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.048531 4717 scope.go:117] "RemoveContainer" containerID="2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7" Oct 07 14:17:00 crc kubenswrapper[4717]: E1007 14:17:00.048806 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7\": container with ID starting with 2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7 not found: ID does not exist" containerID="2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.048826 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7"} err="failed to get container status \"2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7\": rpc error: code = NotFound desc = could not find container \"2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7\": container with ID starting with 2bea6b1ea39c02a8d0f0f129f2fec48294140b0e68ed4d5f2954321c1c13ebe7 not found: ID does not exist" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.122219 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.122254 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.122264 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.127495 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.137397 4717 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6559f4fbd7-hg9q8"] Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.213561 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bb847fbb7-wm5sb"] Oct 07 14:17:00 crc kubenswrapper[4717]: W1007 14:17:00.214798 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddfc15fb_6212_46a6_b1d9_f0a2f1f15fd6.slice/crio-c0fdd2049430d91cc160affb0b443cb21cfb4489ff9bbc168de1d7513c78e01d WatchSource:0}: Error finding container c0fdd2049430d91cc160affb0b443cb21cfb4489ff9bbc168de1d7513c78e01d: Status 404 returned error can't find the container with id c0fdd2049430d91cc160affb0b443cb21cfb4489ff9bbc168de1d7513c78e01d Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.798467 4717 generic.go:334] "Generic (PLEG): container finished" podID="ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6" containerID="f1221c517a18b0a856d066aacd0db364ac0f448e85ec1ae9c5bdad02942cb34c" exitCode=0 Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.798585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" event={"ID":"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6","Type":"ContainerDied","Data":"f1221c517a18b0a856d066aacd0db364ac0f448e85ec1ae9c5bdad02942cb34c"} Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.798809 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" event={"ID":"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6","Type":"ContainerStarted","Data":"c0fdd2049430d91cc160affb0b443cb21cfb4489ff9bbc168de1d7513c78e01d"} Oct 07 14:17:00 crc kubenswrapper[4717]: I1007 14:17:00.880428 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" path="/var/lib/kubelet/pods/d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb/volumes" Oct 07 14:17:01 crc kubenswrapper[4717]: I1007 14:17:01.813517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" event={"ID":"ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6","Type":"ContainerStarted","Data":"e6b35967b5670f4e734b9c24de81d093f212eb5b4868747315520a820cb56da7"} Oct 07 14:17:01 crc kubenswrapper[4717]: I1007 14:17:01.814024 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:17:01 crc kubenswrapper[4717]: I1007 14:17:01.833231 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" podStartSLOduration=2.833213263 podStartE2EDuration="2.833213263s" podCreationTimestamp="2025-10-07 14:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:17:01.830350914 +0000 UTC m=+1403.658276726" watchObservedRunningTime="2025-10-07 14:17:01.833213263 +0000 UTC m=+1403.661139055" Oct 07 14:17:05 crc kubenswrapper[4717]: I1007 14:17:05.767927 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:17:05 crc kubenswrapper[4717]: I1007 14:17:05.821378 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:17:06 crc kubenswrapper[4717]: I1007 14:17:06.006038 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 
14:17:06 crc kubenswrapper[4717]: I1007 14:17:06.857653 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7cwh" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="registry-server" containerID="cri-o://14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804" gracePeriod=2 Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.322327 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.394704 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content\") pod \"4df13739-727c-43f8-9c6b-956f98395ab8\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.395000 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckw9h\" (UniqueName: \"kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h\") pod \"4df13739-727c-43f8-9c6b-956f98395ab8\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.395105 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities\") pod \"4df13739-727c-43f8-9c6b-956f98395ab8\" (UID: \"4df13739-727c-43f8-9c6b-956f98395ab8\") " Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.395589 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities" (OuterVolumeSpecName: "utilities") pod "4df13739-727c-43f8-9c6b-956f98395ab8" (UID: "4df13739-727c-43f8-9c6b-956f98395ab8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.400824 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h" (OuterVolumeSpecName: "kube-api-access-ckw9h") pod "4df13739-727c-43f8-9c6b-956f98395ab8" (UID: "4df13739-727c-43f8-9c6b-956f98395ab8"). InnerVolumeSpecName "kube-api-access-ckw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.467157 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4df13739-727c-43f8-9c6b-956f98395ab8" (UID: "4df13739-727c-43f8-9c6b-956f98395ab8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.497440 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckw9h\" (UniqueName: \"kubernetes.io/projected/4df13739-727c-43f8-9c6b-956f98395ab8-kube-api-access-ckw9h\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.497510 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.497525 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4df13739-727c-43f8-9c6b-956f98395ab8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.868279 4717 generic.go:334] "Generic (PLEG): container finished" podID="4df13739-727c-43f8-9c6b-956f98395ab8" containerID="14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804" exitCode=0 Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.868426 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7cwh" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.868443 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerDied","Data":"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804"} Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.869215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7cwh" event={"ID":"4df13739-727c-43f8-9c6b-956f98395ab8","Type":"ContainerDied","Data":"7b2f398778a27703aea5317608bf1a751e50c473e893cbfba09eefef8b7e0241"} Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.869244 4717 scope.go:117] "RemoveContainer" containerID="14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.897479 4717 scope.go:117] "RemoveContainer" containerID="df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.904238 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.912606 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7cwh"] Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.921404 4717 scope.go:117] "RemoveContainer" containerID="c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.972413 4717 scope.go:117] "RemoveContainer" containerID="14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804" Oct 07 14:17:07 crc kubenswrapper[4717]: E1007 14:17:07.972831 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804\": container with ID starting with 14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804 not found: ID does not exist" containerID="14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.972873 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804"} err="failed to get container status \"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804\": rpc error: code = NotFound desc = could not find container \"14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804\": container with ID starting with 14d05d4102f163b76aff6fb64f7643f2c431e8caefe8cc558c3c3ee00225d804 not found: ID does not exist" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.972900 4717 scope.go:117] "RemoveContainer" containerID="df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c" Oct 07 14:17:07 crc kubenswrapper[4717]: E1007 14:17:07.973130 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c\": container with ID starting with df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c not found: ID does not exist" containerID="df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.973154 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c"} err="failed to get container status \"df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c\": rpc error: code = NotFound desc = could not find container \"df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c\": container with ID starting with df620fd3ee09f408054dc3875b51a495fca6a883aed23c952740f2622f51194c not found: ID does not exist" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.973168 4717 scope.go:117] "RemoveContainer" containerID="c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6" Oct 07 14:17:07 crc kubenswrapper[4717]: E1007 14:17:07.973321 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6\": container with ID starting with c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6 not found: ID does not exist" containerID="c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6" Oct 07 14:17:07 crc kubenswrapper[4717]: I1007 14:17:07.973340 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6"} err="failed to get container status \"c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6\": rpc error: code = NotFound desc = could not find container \"c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6\": container with ID starting with c76e170a3a569efd75cfc99fdcd418ceca22cd25b76ab2c6cbb5f3a432b009b6 not found: ID does not exist" Oct 07 14:17:08 crc kubenswrapper[4717]: I1007 14:17:08.880646 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" path="/var/lib/kubelet/pods/4df13739-727c-43f8-9c6b-956f98395ab8/volumes" Oct 07 14:17:09 crc kubenswrapper[4717]: I1007 14:17:09.566148 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bb847fbb7-wm5sb" Oct 07 14:17:09 crc kubenswrapper[4717]: I1007 14:17:09.643971 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:17:09 crc kubenswrapper[4717]: I1007 14:17:09.644274 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-759799d765-j8g55" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="dnsmasq-dns" containerID="cri-o://d8209c6b679b79c26393eb5cf8715ffe411cba5f0434eae43fee01f5d29ca943" gracePeriod=10 Oct 07 14:17:09 crc kubenswrapper[4717]: I1007 14:17:09.901331 4717 generic.go:334] "Generic (PLEG): container finished" podID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerID="d8209c6b679b79c26393eb5cf8715ffe411cba5f0434eae43fee01f5d29ca943" exitCode=0 Oct 07 14:17:09 crc kubenswrapper[4717]: I1007 14:17:09.901392 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-j8g55" event={"ID":"b5b23020-1c9f-42c1-9b64-4e55b117add0","Type":"ContainerDied","Data":"d8209c6b679b79c26393eb5cf8715ffe411cba5f0434eae43fee01f5d29ca943"} Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.170239 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.236953 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237043 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkj9w\" (UniqueName: \"kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237169 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237195 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237286 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb\") pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.237351 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam\") 
pod \"b5b23020-1c9f-42c1-9b64-4e55b117add0\" (UID: \"b5b23020-1c9f-42c1-9b64-4e55b117add0\") " Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.251103 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w" (OuterVolumeSpecName: "kube-api-access-vkj9w") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "kube-api-access-vkj9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.314995 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.317348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.334862 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.339554 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkj9w\" (UniqueName: \"kubernetes.io/projected/b5b23020-1c9f-42c1-9b64-4e55b117add0-kube-api-access-vkj9w\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.339620 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.339636 4717 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.339645 4717 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.341367 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config" (OuterVolumeSpecName: "config") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.347374 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.351673 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5b23020-1c9f-42c1-9b64-4e55b117add0" (UID: "b5b23020-1c9f-42c1-9b64-4e55b117add0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.440659 4717 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.440698 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.440709 4717 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5b23020-1c9f-42c1-9b64-4e55b117add0-config\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.927721 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-759799d765-j8g55" event={"ID":"b5b23020-1c9f-42c1-9b64-4e55b117add0","Type":"ContainerDied","Data":"d21961faabd8aaaeacee4f1243f6e72ee11bb4fda5d0ff08847713504850134d"} Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.927789 4717 scope.go:117] "RemoveContainer" containerID="d8209c6b679b79c26393eb5cf8715ffe411cba5f0434eae43fee01f5d29ca943" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.927949 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-759799d765-j8g55" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.958190 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.961984 4717 scope.go:117] "RemoveContainer" containerID="90553621d7bf2b41a9a1fa9a4e3f8cb602904f7f491b01fda39d9d4f080216dd" Oct 07 14:17:10 crc kubenswrapper[4717]: I1007 14:17:10.969284 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-759799d765-j8g55"] Oct 07 14:17:12 crc kubenswrapper[4717]: I1007 14:17:12.880230 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" path="/var/lib/kubelet/pods/b5b23020-1c9f-42c1-9b64-4e55b117add0/volumes" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.982806 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985094 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985180 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985254 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="extract-utilities" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985309 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="extract-utilities" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985376 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="extract-content" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985431 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="extract-content" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985495 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="init" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985550 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="init" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985608 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985660 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985713 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="init" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.985765 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="init" Oct 07 14:17:18 crc kubenswrapper[4717]: E1007 14:17:18.985855 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="registry-server" Oct 07 14:17:18 crc 
kubenswrapper[4717]: I1007 14:17:18.985922 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="registry-server" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.986202 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5b23020-1c9f-42c1-9b64-4e55b117add0" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.986280 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="4df13739-727c-43f8-9c6b-956f98395ab8" containerName="registry-server" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.986348 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8cb0ee1-4fe9-473e-9bcf-b6e1fc2081eb" containerName="dnsmasq-dns" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.987915 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:18 crc kubenswrapper[4717]: I1007 14:17:18.997742 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.116232 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.116358 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.116459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjtw8\" (UniqueName: \"kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.219125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.219221 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.219280 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjtw8\" (UniqueName: \"kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " 
pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.219675 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.219688 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.245087 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjtw8\" (UniqueName: \"kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8\") pod \"certified-operators-6fqs6\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.314759 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:19 crc kubenswrapper[4717]: I1007 14:17:19.848161 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:20 crc kubenswrapper[4717]: I1007 14:17:20.011384 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerStarted","Data":"f01ab98107090d2f582813b4c75e9584b97bd8fdb758c5505282278a05df96c8"} Oct 07 14:17:20 crc kubenswrapper[4717]: I1007 14:17:20.017861 4717 generic.go:334] "Generic (PLEG): container finished" podID="d6ac461e-73e2-4268-8a00-6faee58bae2b" containerID="2eed2682da9f51e17fd904c32f636821d1abbb35590ca45054536351ada7acef" exitCode=0 Oct 07 14:17:20 crc kubenswrapper[4717]: I1007 14:17:20.018425 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6ac461e-73e2-4268-8a00-6faee58bae2b","Type":"ContainerDied","Data":"2eed2682da9f51e17fd904c32f636821d1abbb35590ca45054536351ada7acef"} Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.029738 4717 generic.go:334] "Generic (PLEG): container finished" podID="cafb033a-7cce-47a0-9adf-73796beb783f" containerID="a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376" exitCode=0 Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.029842 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerDied","Data":"a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376"} Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.032279 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6ac461e-73e2-4268-8a00-6faee58bae2b","Type":"ContainerStarted","Data":"b9801590d74363fa8e5a0ba54921a996f4141f1a803707d58421af921469efa9"} Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.032882 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 07 14:17:21 crc 
kubenswrapper[4717]: I1007 14:17:21.035505 4717 generic.go:334] "Generic (PLEG): container finished" podID="4472de66-ea08-4251-856b-4cb130e7cf1b" containerID="8cb58ca6de240b7b213adc59c9f0428b6b59eae548eaf19065f3181b590fc423" exitCode=0 Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.035590 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4472de66-ea08-4251-856b-4cb130e7cf1b","Type":"ContainerDied","Data":"8cb58ca6de240b7b213adc59c9f0428b6b59eae548eaf19065f3181b590fc423"} Oct 07 14:17:21 crc kubenswrapper[4717]: I1007 14:17:21.087118 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.087076175 podStartE2EDuration="37.087076175s" podCreationTimestamp="2025-10-07 14:16:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:17:21.080420112 +0000 UTC m=+1422.908345914" watchObservedRunningTime="2025-10-07 14:17:21.087076175 +0000 UTC m=+1422.915001977" Oct 07 14:17:22 crc kubenswrapper[4717]: I1007 14:17:22.046018 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerStarted","Data":"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77"} Oct 07 14:17:22 crc kubenswrapper[4717]: I1007 14:17:22.048935 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4472de66-ea08-4251-856b-4cb130e7cf1b","Type":"ContainerStarted","Data":"27b5a9d425a76811b6ab175e836ae3e5d8cf7d1514483310615de727052a75a4"} Oct 07 14:17:22 crc kubenswrapper[4717]: I1007 14:17:22.147037 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.146982444 podStartE2EDuration="37.146982444s" podCreationTimestamp="2025-10-07 14:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 14:17:22.134899572 +0000 UTC m=+1423.962825374" watchObservedRunningTime="2025-10-07 14:17:22.146982444 +0000 UTC m=+1423.974908246" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.067241 4717 generic.go:334] "Generic (PLEG): container finished" podID="cafb033a-7cce-47a0-9adf-73796beb783f" containerID="6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77" exitCode=0 Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.067477 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx"] Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.070209 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerDied","Data":"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77"} Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.070298 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.074255 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.074412 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.074556 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.074616 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.079459 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx"] Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.097274 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.097360 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmj8n\" (UniqueName: \"kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.097424 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.097455 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.199305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.199640 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmj8n\" (UniqueName: \"kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.199846 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.199957 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.204929 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.205338 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.206125 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.232840 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmj8n\" (UniqueName: \"kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.397757 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:23 crc kubenswrapper[4717]: I1007 14:17:23.998981 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx"] Oct 07 14:17:24 crc kubenswrapper[4717]: W1007 14:17:24.016740 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1da9937b_d5ca_4f21_b803_ef9121b48f23.slice/crio-053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035 WatchSource:0}: Error finding container 053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035: Status 404 returned error can't find the container with id 053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035 Oct 07 14:17:24 crc kubenswrapper[4717]: I1007 14:17:24.079070 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" event={"ID":"1da9937b-d5ca-4f21-b803-ef9121b48f23","Type":"ContainerStarted","Data":"053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035"} Oct 07 14:17:24 crc kubenswrapper[4717]: I1007 14:17:24.081235 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerStarted","Data":"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6"} Oct 07 14:17:24 crc kubenswrapper[4717]: I1007 14:17:24.104591 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fqs6" podStartSLOduration=3.33597983 podStartE2EDuration="6.104570041s" podCreationTimestamp="2025-10-07 14:17:18 +0000 UTC" firstStartedPulling="2025-10-07 14:17:21.032608107 +0000 UTC m=+1422.860533899" lastFinishedPulling="2025-10-07 14:17:23.801198318 +0000 UTC m=+1425.629124110" observedRunningTime="2025-10-07 14:17:24.101357823 +0000 UTC m=+1425.929283615" watchObservedRunningTime="2025-10-07 14:17:24.104570041 +0000 UTC m=+1425.932495833" Oct 07 14:17:25 crc kubenswrapper[4717]: I1007 14:17:25.958033 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:17:29 crc kubenswrapper[4717]: I1007 14:17:29.315731 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:29 crc kubenswrapper[4717]: I1007 14:17:29.317559 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:30 crc kubenswrapper[4717]: I1007 14:17:30.366139 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6fqs6" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="registry-server" probeResult="failure" output=< Oct 07 14:17:30 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:17:30 crc kubenswrapper[4717]: > Oct 07 14:17:35 crc kubenswrapper[4717]: I1007 14:17:35.527799 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 07 14:17:35 crc kubenswrapper[4717]: I1007 14:17:35.961262 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 07 14:17:36 crc kubenswrapper[4717]: I1007 14:17:36.209092 4717 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" event={"ID":"1da9937b-d5ca-4f21-b803-ef9121b48f23","Type":"ContainerStarted","Data":"dd26921ff2b08880502810fac6841730a680cca4741d8798e1bfce6774061e34"} Oct 07 14:17:36 crc kubenswrapper[4717]: I1007 14:17:36.232194 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" podStartSLOduration=2.2813939579999998 podStartE2EDuration="13.232169753s" podCreationTimestamp="2025-10-07 14:17:23 +0000 UTC" firstStartedPulling="2025-10-07 14:17:24.019929173 +0000 UTC m=+1425.847854965" lastFinishedPulling="2025-10-07 14:17:34.970704968 +0000 UTC m=+1436.798630760" observedRunningTime="2025-10-07 14:17:36.227193286 +0000 UTC m=+1438.055119078" watchObservedRunningTime="2025-10-07 14:17:36.232169753 +0000 UTC m=+1438.060095545" Oct 07 14:17:39 crc kubenswrapper[4717]: I1007 14:17:39.365406 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:39 crc kubenswrapper[4717]: I1007 14:17:39.413979 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:39 crc kubenswrapper[4717]: I1007 14:17:39.600343 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.173110 4717 scope.go:117] "RemoveContainer" containerID="0ff22ed15aa209e6635b16c1e44097438429e5b94e6f6069a757c16e326293d4" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.252612 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6fqs6" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="registry-server" containerID="cri-o://f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6" gracePeriod=2 Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.707138 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.791557 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities\") pod \"cafb033a-7cce-47a0-9adf-73796beb783f\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.791632 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content\") pod \"cafb033a-7cce-47a0-9adf-73796beb783f\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.791670 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjtw8\" (UniqueName: \"kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8\") pod \"cafb033a-7cce-47a0-9adf-73796beb783f\" (UID: \"cafb033a-7cce-47a0-9adf-73796beb783f\") " Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.792476 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities" (OuterVolumeSpecName: "utilities") pod "cafb033a-7cce-47a0-9adf-73796beb783f" (UID: "cafb033a-7cce-47a0-9adf-73796beb783f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.803130 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8" (OuterVolumeSpecName: "kube-api-access-bjtw8") pod "cafb033a-7cce-47a0-9adf-73796beb783f" (UID: "cafb033a-7cce-47a0-9adf-73796beb783f"). InnerVolumeSpecName "kube-api-access-bjtw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.839000 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cafb033a-7cce-47a0-9adf-73796beb783f" (UID: "cafb033a-7cce-47a0-9adf-73796beb783f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.895212 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.895256 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cafb033a-7cce-47a0-9adf-73796beb783f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:41 crc kubenswrapper[4717]: I1007 14:17:41.895310 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjtw8\" (UniqueName: \"kubernetes.io/projected/cafb033a-7cce-47a0-9adf-73796beb783f-kube-api-access-bjtw8\") on node \"crc\" DevicePath \"\"" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.282783 4717 generic.go:334] "Generic (PLEG): container finished" podID="cafb033a-7cce-47a0-9adf-73796beb783f" containerID="f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6" exitCode=0 Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.282830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerDied","Data":"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6"} Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.282867 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fqs6" event={"ID":"cafb033a-7cce-47a0-9adf-73796beb783f","Type":"ContainerDied","Data":"f01ab98107090d2f582813b4c75e9584b97bd8fdb758c5505282278a05df96c8"} Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.282887 4717 scope.go:117] "RemoveContainer" containerID="f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.284084 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6fqs6" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.317636 4717 scope.go:117] "RemoveContainer" containerID="6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.322420 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.330467 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6fqs6"] Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.343301 4717 scope.go:117] "RemoveContainer" containerID="a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.392238 4717 scope.go:117] "RemoveContainer" containerID="f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6" Oct 07 14:17:42 crc kubenswrapper[4717]: E1007 14:17:42.393139 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6\": container with ID starting with f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6 not found: ID does not exist" containerID="f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.393197 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6"} err="failed to get container status \"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6\": rpc error: code = NotFound desc = could not find container \"f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6\": container with ID starting with f01bd63670ecc3551079b8d215bdd4cebb1f5fdd4b14172c942e9da7a8f469f6 not found: ID does not exist" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.393231 4717 scope.go:117] "RemoveContainer" containerID="6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77" Oct 07 14:17:42 crc kubenswrapper[4717]: E1007 14:17:42.393881 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77\": container with ID starting with 6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77 not found: ID does not exist" containerID="6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.393911 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77"} err="failed to get container status \"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77\": rpc error: code = NotFound desc = could not find container \"6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77\": container with ID starting with 6e2e9a8c7147c91105664589db413e5108964f152e5e4804956f0620e6232b77 not found: ID does not exist" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.393933 4717 scope.go:117] "RemoveContainer" containerID="a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376" Oct 07 14:17:42 crc kubenswrapper[4717]: E1007 14:17:42.394434 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376\": container with ID starting with a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376 not found: ID does not exist" containerID="a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.394462 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376"} err="failed to get container status \"a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376\": rpc error: code = NotFound desc = could not find container \"a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376\": container with ID starting with a4987105148455427a647c1fb97e1699ee1070007a540f530a671633438c2376 not found: ID does not exist" Oct 07 14:17:42 crc kubenswrapper[4717]: I1007 14:17:42.878431 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" path="/var/lib/kubelet/pods/cafb033a-7cce-47a0-9adf-73796beb783f/volumes" Oct 07 14:17:58 crc kubenswrapper[4717]: I1007 14:17:58.448725 4717 generic.go:334] "Generic (PLEG): container finished" podID="1da9937b-d5ca-4f21-b803-ef9121b48f23" containerID="dd26921ff2b08880502810fac6841730a680cca4741d8798e1bfce6774061e34" exitCode=0 Oct 07 14:17:58 crc kubenswrapper[4717]: I1007 14:17:58.448830 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" event={"ID":"1da9937b-d5ca-4f21-b803-ef9121b48f23","Type":"ContainerDied","Data":"dd26921ff2b08880502810fac6841730a680cca4741d8798e1bfce6774061e34"} Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.856533 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.949262 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmj8n\" (UniqueName: \"kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n\") pod \"1da9937b-d5ca-4f21-b803-ef9121b48f23\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.949337 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory\") pod \"1da9937b-d5ca-4f21-b803-ef9121b48f23\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.949445 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle\") pod \"1da9937b-d5ca-4f21-b803-ef9121b48f23\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.949596 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key\") pod \"1da9937b-d5ca-4f21-b803-ef9121b48f23\" (UID: \"1da9937b-d5ca-4f21-b803-ef9121b48f23\") " Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.955287 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n" (OuterVolumeSpecName: "kube-api-access-lmj8n") pod "1da9937b-d5ca-4f21-b803-ef9121b48f23" (UID: "1da9937b-d5ca-4f21-b803-ef9121b48f23"). InnerVolumeSpecName "kube-api-access-lmj8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.955966 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1da9937b-d5ca-4f21-b803-ef9121b48f23" (UID: "1da9937b-d5ca-4f21-b803-ef9121b48f23"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.979256 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1da9937b-d5ca-4f21-b803-ef9121b48f23" (UID: "1da9937b-d5ca-4f21-b803-ef9121b48f23"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:17:59 crc kubenswrapper[4717]: I1007 14:17:59.981901 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory" (OuterVolumeSpecName: "inventory") pod "1da9937b-d5ca-4f21-b803-ef9121b48f23" (UID: "1da9937b-d5ca-4f21-b803-ef9121b48f23"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.052583 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.052651 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmj8n\" (UniqueName: \"kubernetes.io/projected/1da9937b-d5ca-4f21-b803-ef9121b48f23-kube-api-access-lmj8n\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.052669 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.052709 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1da9937b-d5ca-4f21-b803-ef9121b48f23-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.472129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" event={"ID":"1da9937b-d5ca-4f21-b803-ef9121b48f23","Type":"ContainerDied","Data":"053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035"} Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.472181 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="053bd5fec0afeb8d60fecd4b9c67a9583b97036900d29895183f3a721f87e035" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.472186 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.553537 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn"] Oct 07 14:18:00 crc kubenswrapper[4717]: E1007 14:18:00.553933 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1da9937b-d5ca-4f21-b803-ef9121b48f23" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.553950 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1da9937b-d5ca-4f21-b803-ef9121b48f23" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:00 crc kubenswrapper[4717]: E1007 14:18:00.553965 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="registry-server" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.553971 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="registry-server" Oct 07 14:18:00 crc kubenswrapper[4717]: E1007 14:18:00.553996 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="extract-content" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.554038 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="extract-content" Oct 07 14:18:00 crc kubenswrapper[4717]: E1007 14:18:00.554056 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="extract-utilities" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.554063 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="extract-utilities" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.554250 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafb033a-7cce-47a0-9adf-73796beb783f" containerName="registry-server" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.554264 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1da9937b-d5ca-4f21-b803-ef9121b48f23" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.554920 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.557388 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.557560 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.558168 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.559417 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.580002 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn"] Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.664761 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cnp2\" (UniqueName: \"kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.664873 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.664978 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.766762 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.766891 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cnp2\" (UniqueName: \"kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.766984 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.774727 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.784268 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cnp2\" (UniqueName: \"kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.784681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-24mzn\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:00 crc kubenswrapper[4717]: I1007 14:18:00.874033 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:01 crc kubenswrapper[4717]: I1007 14:18:01.290456 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn"] Oct 07 14:18:01 crc kubenswrapper[4717]: I1007 14:18:01.483248 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" event={"ID":"394bf866-16b7-4c6a-a729-0a716c1bb5de","Type":"ContainerStarted","Data":"62aeba2f9f01dfbda90febfad808b012e31e6ded625c0554960fb70655c077ce"} Oct 07 14:18:01 crc kubenswrapper[4717]: I1007 14:18:01.609606 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:18:01 crc kubenswrapper[4717]: I1007 14:18:01.609671 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:18:02 crc kubenswrapper[4717]: I1007 14:18:02.492974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" event={"ID":"394bf866-16b7-4c6a-a729-0a716c1bb5de","Type":"ContainerStarted","Data":"7ace052c965cff4d72f03d7c9db4492715ef82d3edaa80bfecb48abe82e79a47"} Oct 07 14:18:02 crc kubenswrapper[4717]: I1007 14:18:02.514763 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" podStartSLOduration=1.846656296 podStartE2EDuration="2.514738657s" podCreationTimestamp="2025-10-07 14:18:00 +0000 UTC" firstStartedPulling="2025-10-07 14:18:01.300179023 +0000 UTC m=+1463.128104825" lastFinishedPulling="2025-10-07 
14:18:01.968261394 +0000 UTC m=+1463.796187186" observedRunningTime="2025-10-07 14:18:02.509506963 +0000 UTC m=+1464.337432745" watchObservedRunningTime="2025-10-07 14:18:02.514738657 +0000 UTC m=+1464.342664439" Oct 07 14:18:06 crc kubenswrapper[4717]: I1007 14:18:06.528079 4717 generic.go:334] "Generic (PLEG): container finished" podID="394bf866-16b7-4c6a-a729-0a716c1bb5de" containerID="7ace052c965cff4d72f03d7c9db4492715ef82d3edaa80bfecb48abe82e79a47" exitCode=0 Oct 07 14:18:06 crc kubenswrapper[4717]: I1007 14:18:06.528169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" event={"ID":"394bf866-16b7-4c6a-a729-0a716c1bb5de","Type":"ContainerDied","Data":"7ace052c965cff4d72f03d7c9db4492715ef82d3edaa80bfecb48abe82e79a47"} Oct 07 14:18:07 crc kubenswrapper[4717]: I1007 14:18:07.958451 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.017225 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory\") pod \"394bf866-16b7-4c6a-a729-0a716c1bb5de\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.017274 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key\") pod \"394bf866-16b7-4c6a-a729-0a716c1bb5de\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.017323 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cnp2\" (UniqueName: \"kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2\") pod \"394bf866-16b7-4c6a-a729-0a716c1bb5de\" (UID: \"394bf866-16b7-4c6a-a729-0a716c1bb5de\") " Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.023863 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2" (OuterVolumeSpecName: "kube-api-access-9cnp2") pod "394bf866-16b7-4c6a-a729-0a716c1bb5de" (UID: "394bf866-16b7-4c6a-a729-0a716c1bb5de"). InnerVolumeSpecName "kube-api-access-9cnp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.045703 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory" (OuterVolumeSpecName: "inventory") pod "394bf866-16b7-4c6a-a729-0a716c1bb5de" (UID: "394bf866-16b7-4c6a-a729-0a716c1bb5de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.047546 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "394bf866-16b7-4c6a-a729-0a716c1bb5de" (UID: "394bf866-16b7-4c6a-a729-0a716c1bb5de"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.120104 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.120142 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/394bf866-16b7-4c6a-a729-0a716c1bb5de-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.120151 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cnp2\" (UniqueName: \"kubernetes.io/projected/394bf866-16b7-4c6a-a729-0a716c1bb5de-kube-api-access-9cnp2\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.559117 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" event={"ID":"394bf866-16b7-4c6a-a729-0a716c1bb5de","Type":"ContainerDied","Data":"62aeba2f9f01dfbda90febfad808b012e31e6ded625c0554960fb70655c077ce"} Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.559424 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62aeba2f9f01dfbda90febfad808b012e31e6ded625c0554960fb70655c077ce" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.559171 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-24mzn" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.623622 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp"] Oct 07 14:18:08 crc kubenswrapper[4717]: E1007 14:18:08.624471 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394bf866-16b7-4c6a-a729-0a716c1bb5de" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.624495 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="394bf866-16b7-4c6a-a729-0a716c1bb5de" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.624722 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="394bf866-16b7-4c6a-a729-0a716c1bb5de" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.626515 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.628826 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.629053 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.629160 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.629263 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.638979 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp"] Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.731681 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.731744 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.731879 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.732025 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2cvb\" (UniqueName: \"kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.834167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.834257 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.834327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.834474 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2cvb\" (UniqueName: \"kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.840973 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.841095 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.846660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.852812 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2cvb\" (UniqueName: \"kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:08 crc kubenswrapper[4717]: I1007 14:18:08.948510 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:18:09 crc kubenswrapper[4717]: I1007 14:18:09.490235 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp"] Oct 07 14:18:09 crc kubenswrapper[4717]: I1007 14:18:09.577387 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" event={"ID":"026b85b1-41cc-4a6f-9638-909bc0e6099e","Type":"ContainerStarted","Data":"00fca65af167595aadb2c40eef9370d49011c1567fc10175468d95f4bf326a5a"} Oct 07 14:18:10 crc kubenswrapper[4717]: I1007 14:18:10.587602 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" event={"ID":"026b85b1-41cc-4a6f-9638-909bc0e6099e","Type":"ContainerStarted","Data":"4ca17c1bd2118324062d1541d42b1aaa7bc025dead169dbbe92fc5c188444591"} Oct 07 14:18:10 crc kubenswrapper[4717]: I1007 14:18:10.612630 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" podStartSLOduration=2.217830884 podStartE2EDuration="2.612601891s" podCreationTimestamp="2025-10-07 14:18:08 +0000 UTC" firstStartedPulling="2025-10-07 14:18:09.490876143 +0000 UTC m=+1471.318801935" lastFinishedPulling="2025-10-07 14:18:09.88564715 +0000 UTC m=+1471.713572942" observedRunningTime="2025-10-07 14:18:10.605075404 +0000 UTC m=+1472.433001196" watchObservedRunningTime="2025-10-07 14:18:10.612601891 +0000 UTC m=+1472.440527683" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.604495 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.607931 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.618963 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.679774 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.680059 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.680196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29jsd\" (UniqueName: \"kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.782076 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.782185 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.782249 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29jsd\" (UniqueName: \"kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.782565 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.782681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.805548 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-29jsd\" (UniqueName: \"kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd\") pod \"community-operators-w7mr5\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:24 crc kubenswrapper[4717]: I1007 14:18:24.951052 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:25 crc kubenswrapper[4717]: I1007 14:18:25.537909 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:25 crc kubenswrapper[4717]: I1007 14:18:25.725666 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerStarted","Data":"e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258"} Oct 07 14:18:26 crc kubenswrapper[4717]: I1007 14:18:26.736790 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerID="3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34" exitCode=0 Oct 07 14:18:26 crc kubenswrapper[4717]: I1007 14:18:26.736896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerDied","Data":"3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34"} Oct 07 14:18:27 crc kubenswrapper[4717]: I1007 14:18:27.750503 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerStarted","Data":"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478"} Oct 07 14:18:28 crc kubenswrapper[4717]: I1007 14:18:28.760322 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerID="069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478" exitCode=0 Oct 07 14:18:28 crc kubenswrapper[4717]: I1007 14:18:28.760366 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerDied","Data":"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478"} Oct 07 14:18:29 crc kubenswrapper[4717]: I1007 14:18:29.772368 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerStarted","Data":"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c"} Oct 07 14:18:29 crc kubenswrapper[4717]: I1007 14:18:29.790281 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w7mr5" podStartSLOduration=3.302797981 podStartE2EDuration="5.790263275s" podCreationTimestamp="2025-10-07 14:18:24 +0000 UTC" firstStartedPulling="2025-10-07 14:18:26.739252478 +0000 UTC m=+1488.567178270" lastFinishedPulling="2025-10-07 14:18:29.226717772 +0000 UTC m=+1491.054643564" observedRunningTime="2025-10-07 14:18:29.78753904 +0000 UTC m=+1491.615464862" watchObservedRunningTime="2025-10-07 14:18:29.790263275 +0000 UTC m=+1491.618189067" Oct 07 14:18:31 crc kubenswrapper[4717]: I1007 14:18:31.609682 4717 patch_prober.go:28] interesting 
pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:18:31 crc kubenswrapper[4717]: I1007 14:18:31.610031 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:18:34 crc kubenswrapper[4717]: I1007 14:18:34.951447 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:34 crc kubenswrapper[4717]: I1007 14:18:34.951782 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:35 crc kubenswrapper[4717]: I1007 14:18:35.012537 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:35 crc kubenswrapper[4717]: I1007 14:18:35.873255 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:35 crc kubenswrapper[4717]: I1007 14:18:35.920387 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:37 crc kubenswrapper[4717]: I1007 14:18:37.839339 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w7mr5" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="registry-server" containerID="cri-o://e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c" gracePeriod=2 Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.285125 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.365745 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content\") pod \"f8dbe1db-f888-4708-b279-9463629ca0ee\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.365845 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29jsd\" (UniqueName: \"kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd\") pod \"f8dbe1db-f888-4708-b279-9463629ca0ee\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.366027 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities\") pod \"f8dbe1db-f888-4708-b279-9463629ca0ee\" (UID: \"f8dbe1db-f888-4708-b279-9463629ca0ee\") " Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.367037 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities" (OuterVolumeSpecName: "utilities") pod "f8dbe1db-f888-4708-b279-9463629ca0ee" (UID: "f8dbe1db-f888-4708-b279-9463629ca0ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.371293 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd" (OuterVolumeSpecName: "kube-api-access-29jsd") pod "f8dbe1db-f888-4708-b279-9463629ca0ee" (UID: "f8dbe1db-f888-4708-b279-9463629ca0ee"). InnerVolumeSpecName "kube-api-access-29jsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.419158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8dbe1db-f888-4708-b279-9463629ca0ee" (UID: "f8dbe1db-f888-4708-b279-9463629ca0ee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.468590 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.468639 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8dbe1db-f888-4708-b279-9463629ca0ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.468654 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29jsd\" (UniqueName: \"kubernetes.io/projected/f8dbe1db-f888-4708-b279-9463629ca0ee-kube-api-access-29jsd\") on node \"crc\" DevicePath \"\"" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.850994 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerID="e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c" exitCode=0 Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.851079 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w7mr5" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.851101 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerDied","Data":"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c"} Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.851597 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w7mr5" event={"ID":"f8dbe1db-f888-4708-b279-9463629ca0ee","Type":"ContainerDied","Data":"e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258"} Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.851615 4717 scope.go:117] "RemoveContainer" containerID="e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.882373 4717 scope.go:117] "RemoveContainer" containerID="069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.920509 4717 scope.go:117] "RemoveContainer" containerID="3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.922736 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.932219 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w7mr5"] Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.957346 4717 scope.go:117] "RemoveContainer" containerID="e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c" Oct 07 14:18:38 crc kubenswrapper[4717]: E1007 14:18:38.965730 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c\": container with ID starting with e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c not found: ID does not exist" containerID="e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.965797 
4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c"} err="failed to get container status \"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c\": rpc error: code = NotFound desc = could not find container \"e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c\": container with ID starting with e55d699345c5f58995b3c28f8ee33d57af5b09d2613a20c2333bb8f24019b35c not found: ID does not exist" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.965826 4717 scope.go:117] "RemoveContainer" containerID="069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478" Oct 07 14:18:38 crc kubenswrapper[4717]: E1007 14:18:38.966414 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478\": container with ID starting with 069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478 not found: ID does not exist" containerID="069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.966471 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478"} err="failed to get container status \"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478\": rpc error: code = NotFound desc = could not find container \"069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478\": container with ID starting with 069de0895da057b4d07130f2b5875ab5892f8723a006d3d1cba1d3f09c1b7478 not found: ID does not exist" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.966486 4717 scope.go:117] "RemoveContainer" containerID="3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34" Oct 07 14:18:38 crc kubenswrapper[4717]: E1007 14:18:38.966778 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34\": container with ID starting with 3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34 not found: ID does not exist" containerID="3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34" Oct 07 14:18:38 crc kubenswrapper[4717]: I1007 14:18:38.966805 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34"} err="failed to get container status \"3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34\": rpc error: code = NotFound desc = could not find container \"3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34\": container with ID starting with 3479741711c944854a4b04f1c76b7feb09561168c7965516b10c5367f893ab34 not found: ID does not exist" Oct 07 14:18:40 crc kubenswrapper[4717]: I1007 14:18:40.885309 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" path="/var/lib/kubelet/pods/f8dbe1db-f888-4708-b279-9463629ca0ee/volumes" Oct 07 14:18:41 crc kubenswrapper[4717]: I1007 14:18:41.319612 4717 scope.go:117] "RemoveContainer" containerID="1be9c8727e96c507158a78f88891832a24bc2136e825aa65da4b1531801f787f" Oct 07 14:18:41 crc kubenswrapper[4717]: I1007 14:18:41.495231 4717 scope.go:117] "RemoveContainer" 
containerID="154cac6ff0aa1e601735fe658a0b5326673b4765554107198a72eb49227ad728" Oct 07 14:18:41 crc kubenswrapper[4717]: I1007 14:18:41.531649 4717 scope.go:117] "RemoveContainer" containerID="de5625b664832e995fc93a9a6085e4a0819c974d6527e6d58b235012ed3d3157" Oct 07 14:18:45 crc kubenswrapper[4717]: E1007 14:18:45.232785 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache]" Oct 07 14:18:55 crc kubenswrapper[4717]: E1007 14:18:55.510503 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:01 crc kubenswrapper[4717]: I1007 14:19:01.609331 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:19:01 crc kubenswrapper[4717]: I1007 14:19:01.609817 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:19:01 crc kubenswrapper[4717]: I1007 14:19:01.609854 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:19:01 crc kubenswrapper[4717]: I1007 14:19:01.610542 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:19:01 crc kubenswrapper[4717]: I1007 14:19:01.610592 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" gracePeriod=600 Oct 07 14:19:01 crc kubenswrapper[4717]: E1007 14:19:01.730959 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:19:02 crc kubenswrapper[4717]: I1007 14:19:02.084636 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" exitCode=0 Oct 07 14:19:02 crc kubenswrapper[4717]: I1007 14:19:02.084752 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86"} Oct 07 14:19:02 crc kubenswrapper[4717]: I1007 14:19:02.085070 4717 scope.go:117] "RemoveContainer" containerID="80da36f335297e73db206108880448f662f17a9582de4598a70a5b6e5e4985c0" Oct 07 14:19:02 crc kubenswrapper[4717]: I1007 14:19:02.086691 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:19:02 crc kubenswrapper[4717]: E1007 14:19:02.087529 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:19:05 crc kubenswrapper[4717]: E1007 14:19:05.758243 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:15 crc kubenswrapper[4717]: I1007 14:19:15.868151 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:19:15 crc kubenswrapper[4717]: E1007 14:19:15.868868 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:19:16 crc kubenswrapper[4717]: E1007 14:19:16.029912 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:26 crc kubenswrapper[4717]: E1007 14:19:26.274828 4717 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:30 crc kubenswrapper[4717]: I1007 14:19:30.868668 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:19:30 crc kubenswrapper[4717]: E1007 14:19:30.869876 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:19:36 crc kubenswrapper[4717]: E1007 14:19:36.555499 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dbe1db_f888_4708_b279_9463629ca0ee.slice/crio-e5f3a0059d98aed5ec39471ed3437e5e24c3ab3db61fde259d6abc543301c258\": RecentStats: unable to find data in memory cache]" Oct 07 14:19:41 crc kubenswrapper[4717]: I1007 14:19:41.640808 4717 scope.go:117] "RemoveContainer" containerID="586132d8ba4ee20901a707b885dca94c20b37ea13f570eaed63b4fd3a257e297" Oct 07 14:19:41 crc kubenswrapper[4717]: I1007 14:19:41.693490 4717 scope.go:117] "RemoveContainer" containerID="104456aa6bf6f58617ecb124f755829e0bc240f3713eda0f24ac039e06194c48" Oct 07 14:19:43 crc kubenswrapper[4717]: I1007 14:19:43.869661 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:19:43 crc kubenswrapper[4717]: E1007 14:19:43.870511 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:19:57 crc kubenswrapper[4717]: I1007 14:19:57.868910 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:19:57 crc kubenswrapper[4717]: E1007 14:19:57.869816 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:20:11 crc kubenswrapper[4717]: I1007 14:20:11.867949 4717 scope.go:117] "RemoveContainer" 
containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:20:11 crc kubenswrapper[4717]: E1007 14:20:11.868664 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:20:26 crc kubenswrapper[4717]: I1007 14:20:26.869281 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:20:26 crc kubenswrapper[4717]: E1007 14:20:26.870945 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:20:41 crc kubenswrapper[4717]: I1007 14:20:41.761043 4717 scope.go:117] "RemoveContainer" containerID="3d5b8fb06e24ec12ee704c978682cc77f8ac65ea3492d500405afb487e1c6eab" Oct 07 14:20:41 crc kubenswrapper[4717]: I1007 14:20:41.869044 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:20:41 crc kubenswrapper[4717]: E1007 14:20:41.869378 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:20:52 crc kubenswrapper[4717]: I1007 14:20:52.868274 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:20:52 crc kubenswrapper[4717]: E1007 14:20:52.869187 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.079953 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-5flzp"] Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.090130 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4qpnj"] Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.101442 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-5flzp"] Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.113356 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4qpnj"] Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.890224 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="69e50833-7e85-44b4-926f-e088d5a065fc" path="/var/lib/kubelet/pods/69e50833-7e85-44b4-926f-e088d5a065fc/volumes" Oct 07 14:21:02 crc kubenswrapper[4717]: I1007 14:21:02.893638 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c364d515-73f8-4e3f-9fde-a6f583e3a807" path="/var/lib/kubelet/pods/c364d515-73f8-4e3f-9fde-a6f583e3a807/volumes" Oct 07 14:21:05 crc kubenswrapper[4717]: I1007 14:21:05.026905 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-kdghz"] Oct 07 14:21:05 crc kubenswrapper[4717]: I1007 14:21:05.037241 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-kdghz"] Oct 07 14:21:06 crc kubenswrapper[4717]: I1007 14:21:06.869109 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:21:06 crc kubenswrapper[4717]: E1007 14:21:06.869645 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:21:06 crc kubenswrapper[4717]: I1007 14:21:06.880136 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5961e0b0-7403-4c58-ae35-1609815f5c4b" path="/var/lib/kubelet/pods/5961e0b0-7403-4c58-ae35-1609815f5c4b/volumes" Oct 07 14:21:11 crc kubenswrapper[4717]: I1007 14:21:11.039771 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d269-account-create-9kgwm"] Oct 07 14:21:11 crc kubenswrapper[4717]: I1007 14:21:11.051411 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d269-account-create-9kgwm"] Oct 07 14:21:12 crc kubenswrapper[4717]: I1007 14:21:12.882048 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed7e8d9-d828-4710-be5c-91e84e17a057" path="/var/lib/kubelet/pods/eed7e8d9-d828-4710-be5c-91e84e17a057/volumes" Oct 07 14:21:16 crc kubenswrapper[4717]: I1007 14:21:16.030616 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bf89-account-create-h42p7"] Oct 07 14:21:16 crc kubenswrapper[4717]: I1007 14:21:16.039678 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bf89-account-create-h42p7"] Oct 07 14:21:16 crc kubenswrapper[4717]: I1007 14:21:16.878413 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e58acfa-d9c6-44d0-bff3-b12663e9095e" path="/var/lib/kubelet/pods/6e58acfa-d9c6-44d0-bff3-b12663e9095e/volumes" Oct 07 14:21:21 crc kubenswrapper[4717]: I1007 14:21:21.029185 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-0fb7-account-create-h62sp"] Oct 07 14:21:21 crc kubenswrapper[4717]: I1007 14:21:21.038199 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-0fb7-account-create-h62sp"] Oct 07 14:21:21 crc kubenswrapper[4717]: I1007 14:21:21.868864 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:21:21 crc kubenswrapper[4717]: E1007 14:21:21.869345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:21:22 crc kubenswrapper[4717]: I1007 14:21:22.880559 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a976f8f9-576f-4e2d-bf4c-f2bed066725b" path="/var/lib/kubelet/pods/a976f8f9-576f-4e2d-bf4c-f2bed066725b/volumes" Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.047122 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-5d62c"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.057888 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rmvdk"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.068898 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dq9k9"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.079773 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-l64ln"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.088164 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-l64ln"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.096274 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rmvdk"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.104500 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dq9k9"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.134981 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-5d62c"] Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.868968 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:21:36 crc kubenswrapper[4717]: E1007 14:21:36.869492 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.880270 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="193ab1fd-3c4e-463b-9f5f-a85cbe1fd633" path="/var/lib/kubelet/pods/193ab1fd-3c4e-463b-9f5f-a85cbe1fd633/volumes" Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.881557 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c14c91-5e26-48d3-86c8-79dcb89d3c16" path="/var/lib/kubelet/pods/93c14c91-5e26-48d3-86c8-79dcb89d3c16/volumes" Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.883408 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e05b05a1-ebbd-4437-aa14-2b6736950306" path="/var/lib/kubelet/pods/e05b05a1-ebbd-4437-aa14-2b6736950306/volumes" Oct 07 14:21:36 crc kubenswrapper[4717]: I1007 14:21:36.884354 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60f513f-7654-4b4a-b4ef-ea4637a7f364" path="/var/lib/kubelet/pods/e60f513f-7654-4b4a-b4ef-ea4637a7f364/volumes" Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.039083 4717 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/glance-db-sync-jkk7w"] Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.055701 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-8kmdr"] Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.064215 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-8kmdr"] Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.072369 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jkk7w"] Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.879877 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842ad804-63f4-4fb3-9f2e-f7e70c91f3a5" path="/var/lib/kubelet/pods/842ad804-63f4-4fb3-9f2e-f7e70c91f3a5/volumes" Oct 07 14:21:40 crc kubenswrapper[4717]: I1007 14:21:40.881274 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c68042ef-a515-4c4a-b1d5-82fcc3549ea6" path="/var/lib/kubelet/pods/c68042ef-a515-4c4a-b1d5-82fcc3549ea6/volumes" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.802342 4717 scope.go:117] "RemoveContainer" containerID="c4b247182175d5c2daa2eac6e3b519f8495244bbffb48463bbf11859dde1a8dd" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.825770 4717 scope.go:117] "RemoveContainer" containerID="20ebb6863b62e62972bb58f55e349ce82e4f592f72de0a2a477eefe6988852ee" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.856198 4717 scope.go:117] "RemoveContainer" containerID="d399456bc67fe32372b87ff9964ee175f2a2876534418938d4d0d674e70cb5a1" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.896932 4717 scope.go:117] "RemoveContainer" containerID="457b4967b8bd1f9dd527d2982aa50fb134455c69fc8a81b663b070171a9751f0" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.976985 4717 scope.go:117] "RemoveContainer" containerID="37e0eb90e1710d6d1246629f4fe05af36cb0ae73d218124feac13fd54755bd53" Oct 07 14:21:41 crc kubenswrapper[4717]: I1007 14:21:41.997956 4717 scope.go:117] "RemoveContainer" containerID="1a2ca749888268961419e07265521cf4631ed355bcdb7e4902963cf7249fee16" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.067914 4717 scope.go:117] "RemoveContainer" containerID="807e6086481132bbd39a602a72f55a985855bb36106610a0040511af613d0c5d" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.099230 4717 scope.go:117] "RemoveContainer" containerID="e286a66694dbfb7cc28174d045b746e8d7dcc189935265bf4e2d671d25aac995" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.127952 4717 scope.go:117] "RemoveContainer" containerID="4454fe08906d672ec4fca90e12cdb29a788254b640a5e9c8acba8b50d65abe2e" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.181312 4717 scope.go:117] "RemoveContainer" containerID="62c02fefd3080b733dbf933a450921789724e4301348ad6d7bc0d29c4e853849" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.220033 4717 scope.go:117] "RemoveContainer" containerID="9f203292f2c2f5c8988fb8077209b7b1fba52c253faac8718fd7871b2117a0b8" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.255165 4717 scope.go:117] "RemoveContainer" containerID="5148104e540ffb1d2141d384ee907c2ea28a83f8f8fe77595f9121ae58c67c05" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.380318 4717 scope.go:117] "RemoveContainer" containerID="375756edbedb6d7becdb928395e15978f91c835d1390007026c0238bde491bc2" Oct 07 14:21:42 crc kubenswrapper[4717]: I1007 14:21:42.404505 4717 scope.go:117] "RemoveContainer" containerID="86d6c2d66b82940e46906fda1852fd5202cdbe872128d88161e1ed60eb1f3bb6" Oct 07 14:21:44 crc 
kubenswrapper[4717]: I1007 14:21:44.508796 4717 generic.go:334] "Generic (PLEG): container finished" podID="026b85b1-41cc-4a6f-9638-909bc0e6099e" containerID="4ca17c1bd2118324062d1541d42b1aaa7bc025dead169dbbe92fc5c188444591" exitCode=0 Oct 07 14:21:44 crc kubenswrapper[4717]: I1007 14:21:44.508906 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" event={"ID":"026b85b1-41cc-4a6f-9638-909bc0e6099e","Type":"ContainerDied","Data":"4ca17c1bd2118324062d1541d42b1aaa7bc025dead169dbbe92fc5c188444591"} Oct 07 14:21:45 crc kubenswrapper[4717]: I1007 14:21:45.936343 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:21:45 crc kubenswrapper[4717]: I1007 14:21:45.990448 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2cvb\" (UniqueName: \"kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb\") pod \"026b85b1-41cc-4a6f-9638-909bc0e6099e\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " Oct 07 14:21:45 crc kubenswrapper[4717]: I1007 14:21:45.990663 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory\") pod \"026b85b1-41cc-4a6f-9638-909bc0e6099e\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " Oct 07 14:21:45 crc kubenswrapper[4717]: I1007 14:21:45.990712 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle\") pod \"026b85b1-41cc-4a6f-9638-909bc0e6099e\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " Oct 07 14:21:45 crc kubenswrapper[4717]: I1007 14:21:45.990741 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key\") pod \"026b85b1-41cc-4a6f-9638-909bc0e6099e\" (UID: \"026b85b1-41cc-4a6f-9638-909bc0e6099e\") " Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.000857 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb" (OuterVolumeSpecName: "kube-api-access-w2cvb") pod "026b85b1-41cc-4a6f-9638-909bc0e6099e" (UID: "026b85b1-41cc-4a6f-9638-909bc0e6099e"). InnerVolumeSpecName "kube-api-access-w2cvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.006275 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "026b85b1-41cc-4a6f-9638-909bc0e6099e" (UID: "026b85b1-41cc-4a6f-9638-909bc0e6099e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.024616 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory" (OuterVolumeSpecName: "inventory") pod "026b85b1-41cc-4a6f-9638-909bc0e6099e" (UID: "026b85b1-41cc-4a6f-9638-909bc0e6099e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.027259 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "026b85b1-41cc-4a6f-9638-909bc0e6099e" (UID: "026b85b1-41cc-4a6f-9638-909bc0e6099e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.094754 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2cvb\" (UniqueName: \"kubernetes.io/projected/026b85b1-41cc-4a6f-9638-909bc0e6099e-kube-api-access-w2cvb\") on node \"crc\" DevicePath \"\"" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.095084 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.095173 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.095252 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/026b85b1-41cc-4a6f-9638-909bc0e6099e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.528394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" event={"ID":"026b85b1-41cc-4a6f-9638-909bc0e6099e","Type":"ContainerDied","Data":"00fca65af167595aadb2c40eef9370d49011c1567fc10175468d95f4bf326a5a"} Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.528442 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fca65af167595aadb2c40eef9370d49011c1567fc10175468d95f4bf326a5a" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.528492 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.634337 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l"] Oct 07 14:21:46 crc kubenswrapper[4717]: E1007 14:21:46.635045 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="026b85b1-41cc-4a6f-9638-909bc0e6099e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635064 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="026b85b1-41cc-4a6f-9638-909bc0e6099e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 14:21:46 crc kubenswrapper[4717]: E1007 14:21:46.635095 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="extract-content" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635103 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="extract-content" Oct 07 14:21:46 crc kubenswrapper[4717]: E1007 14:21:46.635116 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="extract-utilities" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635121 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="extract-utilities" Oct 07 14:21:46 crc kubenswrapper[4717]: E1007 14:21:46.635132 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="registry-server" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635138 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="registry-server" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635304 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="026b85b1-41cc-4a6f-9638-909bc0e6099e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.635327 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dbe1db-f888-4708-b279-9463629ca0ee" containerName="registry-server" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.636038 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.638663 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.638899 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.638968 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.639104 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.651720 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l"] Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.706403 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9hjx\" (UniqueName: \"kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.706523 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.706711 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.808769 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.808895 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9hjx\" (UniqueName: \"kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.808939 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.824028 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.825307 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.825837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9hjx\" (UniqueName: \"kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:46 crc kubenswrapper[4717]: I1007 14:21:46.955511 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:21:47 crc kubenswrapper[4717]: I1007 14:21:47.452771 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l"] Oct 07 14:21:47 crc kubenswrapper[4717]: I1007 14:21:47.455214 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:21:47 crc kubenswrapper[4717]: I1007 14:21:47.542796 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" event={"ID":"769125a8-8870-4ddf-86e3-cb1bfa198b41","Type":"ContainerStarted","Data":"6ae367bca32578bc524520cc26e2c17cf8818a725f9ba252f7eda35d655fe06e"} Oct 07 14:21:48 crc kubenswrapper[4717]: I1007 14:21:48.554162 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" event={"ID":"769125a8-8870-4ddf-86e3-cb1bfa198b41","Type":"ContainerStarted","Data":"dce8924f1cc075007e6b926fa847518a840c4e22b957c65c25bcdb2403955ed8"} Oct 07 14:21:48 crc kubenswrapper[4717]: I1007 14:21:48.571349 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" podStartSLOduration=2.155735348 podStartE2EDuration="2.571332434s" podCreationTimestamp="2025-10-07 14:21:46 +0000 UTC" firstStartedPulling="2025-10-07 14:21:47.454982141 +0000 UTC m=+1689.282907933" lastFinishedPulling="2025-10-07 14:21:47.870579227 +0000 UTC m=+1689.698505019" observedRunningTime="2025-10-07 14:21:48.570424809 +0000 UTC m=+1690.398350601" watchObservedRunningTime="2025-10-07 14:21:48.571332434 +0000 UTC m=+1690.399258226" Oct 07 14:21:50 crc kubenswrapper[4717]: I1007 14:21:50.869143 4717 scope.go:117] "RemoveContainer" 
containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:21:50 crc kubenswrapper[4717]: E1007 14:21:50.869782 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:21:51 crc kubenswrapper[4717]: I1007 14:21:51.028472 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8cd3-account-create-h8gpd"] Oct 07 14:21:51 crc kubenswrapper[4717]: I1007 14:21:51.038989 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8cd3-account-create-h8gpd"] Oct 07 14:21:52 crc kubenswrapper[4717]: I1007 14:21:52.881774 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc43ea6-a0ae-48a5-9c55-4b932864ea43" path="/var/lib/kubelet/pods/9dc43ea6-a0ae-48a5-9c55-4b932864ea43/volumes" Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.048587 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e87f-account-create-m8mwt"] Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.062177 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-7b91-account-create-5dp6f"] Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.071665 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4f33-account-create-rpvq6"] Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.083236 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-7b91-account-create-5dp6f"] Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.093575 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-4f33-account-create-rpvq6"] Oct 07 14:22:01 crc kubenswrapper[4717]: I1007 14:22:01.103997 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e87f-account-create-m8mwt"] Oct 07 14:22:02 crc kubenswrapper[4717]: I1007 14:22:02.881753 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bda80bb-339d-4c40-a8cf-91dc994bcc15" path="/var/lib/kubelet/pods/0bda80bb-339d-4c40-a8cf-91dc994bcc15/volumes" Oct 07 14:22:02 crc kubenswrapper[4717]: I1007 14:22:02.883395 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1189ffe2-55f3-4da5-9cf5-2a871828cfa3" path="/var/lib/kubelet/pods/1189ffe2-55f3-4da5-9cf5-2a871828cfa3/volumes" Oct 07 14:22:02 crc kubenswrapper[4717]: I1007 14:22:02.884886 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce8237ed-de9c-4c53-a716-de316c22554b" path="/var/lib/kubelet/pods/ce8237ed-de9c-4c53-a716-de316c22554b/volumes" Oct 07 14:22:04 crc kubenswrapper[4717]: I1007 14:22:04.868898 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:22:04 crc kubenswrapper[4717]: E1007 14:22:04.869467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" 
podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:22:11 crc kubenswrapper[4717]: I1007 14:22:11.030576 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-z2524"] Oct 07 14:22:11 crc kubenswrapper[4717]: I1007 14:22:11.042855 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-z2524"] Oct 07 14:22:12 crc kubenswrapper[4717]: I1007 14:22:12.879858 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3" path="/var/lib/kubelet/pods/de7c8723-1ef0-4d29-b4ba-97b8dc35e7e3/volumes" Oct 07 14:22:15 crc kubenswrapper[4717]: I1007 14:22:15.399726 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-9896cd659-vvdxn" podUID="f44baabf-f1c4-4036-8c6b-ce32cc6cf541" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 07 14:22:19 crc kubenswrapper[4717]: I1007 14:22:19.869350 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:22:19 crc kubenswrapper[4717]: E1007 14:22:19.871348 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:22:30 crc kubenswrapper[4717]: I1007 14:22:30.868133 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:22:30 crc kubenswrapper[4717]: E1007 14:22:30.868839 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:22:42 crc kubenswrapper[4717]: I1007 14:22:42.668261 4717 scope.go:117] "RemoveContainer" containerID="0a8050a8cd4d044ad4d231288b090a4cfc07e20d794bdee8a68820d57518ba75" Oct 07 14:22:42 crc kubenswrapper[4717]: I1007 14:22:42.692607 4717 scope.go:117] "RemoveContainer" containerID="802349aec5f093427e75c76200160d57a5a7989e456b9f437e17268d078c350a" Oct 07 14:22:42 crc kubenswrapper[4717]: I1007 14:22:42.745603 4717 scope.go:117] "RemoveContainer" containerID="a03122b0c53e442dfa9ea67095f432f82a90d23d95cc4352ee506aa1cd33b724" Oct 07 14:22:42 crc kubenswrapper[4717]: I1007 14:22:42.786959 4717 scope.go:117] "RemoveContainer" containerID="b6faa65930282b1c869049b210335050b683b6225fb3aba45b13724f327ae3cb" Oct 07 14:22:42 crc kubenswrapper[4717]: I1007 14:22:42.850159 4717 scope.go:117] "RemoveContainer" containerID="8094c3c8c4396c5d8ee7febbf5a60ef36df9e85e71eefeefacc7b025a4c3860d" Oct 07 14:22:44 crc kubenswrapper[4717]: I1007 14:22:44.054636 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-qmft2"] Oct 07 14:22:44 crc kubenswrapper[4717]: I1007 14:22:44.062082 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-qmft2"] Oct 07 14:22:44 crc kubenswrapper[4717]: I1007 14:22:44.869201 4717 
scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:22:44 crc kubenswrapper[4717]: E1007 14:22:44.869479 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:22:44 crc kubenswrapper[4717]: I1007 14:22:44.879100 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb1a5ed9-e123-447d-a56d-e0cce35eb56a" path="/var/lib/kubelet/pods/eb1a5ed9-e123-447d-a56d-e0cce35eb56a/volumes" Oct 07 14:22:56 crc kubenswrapper[4717]: I1007 14:22:56.868934 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:22:56 crc kubenswrapper[4717]: E1007 14:22:56.869666 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:23:04 crc kubenswrapper[4717]: I1007 14:23:04.045614 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tjhjz"] Oct 07 14:23:04 crc kubenswrapper[4717]: I1007 14:23:04.058285 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tjhjz"] Oct 07 14:23:04 crc kubenswrapper[4717]: I1007 14:23:04.881507 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd33a1d-9592-465a-8285-941a03e92fa4" path="/var/lib/kubelet/pods/8dd33a1d-9592-465a-8285-941a03e92fa4/volumes" Oct 07 14:23:06 crc kubenswrapper[4717]: I1007 14:23:06.032079 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gch6l"] Oct 07 14:23:06 crc kubenswrapper[4717]: I1007 14:23:06.040356 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gch6l"] Oct 07 14:23:06 crc kubenswrapper[4717]: I1007 14:23:06.878957 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="339f6768-36c5-4856-9159-29573ce25fe8" path="/var/lib/kubelet/pods/339f6768-36c5-4856-9159-29573ce25fe8/volumes" Oct 07 14:23:08 crc kubenswrapper[4717]: I1007 14:23:08.025772 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8w5dj"] Oct 07 14:23:08 crc kubenswrapper[4717]: I1007 14:23:08.032214 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8w5dj"] Oct 07 14:23:08 crc kubenswrapper[4717]: I1007 14:23:08.884082 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6f9762-a7d3-48ee-97ce-57439f4ee323" path="/var/lib/kubelet/pods/cb6f9762-a7d3-48ee-97ce-57439f4ee323/volumes" Oct 07 14:23:11 crc kubenswrapper[4717]: I1007 14:23:11.868542 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:23:11 crc kubenswrapper[4717]: E1007 14:23:11.869274 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:23:21 crc kubenswrapper[4717]: I1007 14:23:21.352320 4717 generic.go:334] "Generic (PLEG): container finished" podID="769125a8-8870-4ddf-86e3-cb1bfa198b41" containerID="dce8924f1cc075007e6b926fa847518a840c4e22b957c65c25bcdb2403955ed8" exitCode=0 Oct 07 14:23:21 crc kubenswrapper[4717]: I1007 14:23:21.352357 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" event={"ID":"769125a8-8870-4ddf-86e3-cb1bfa198b41","Type":"ContainerDied","Data":"dce8924f1cc075007e6b926fa847518a840c4e22b957c65c25bcdb2403955ed8"} Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.776628 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.868705 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:23:22 crc kubenswrapper[4717]: E1007 14:23:22.869129 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.907722 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key\") pod \"769125a8-8870-4ddf-86e3-cb1bfa198b41\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.907822 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9hjx\" (UniqueName: \"kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx\") pod \"769125a8-8870-4ddf-86e3-cb1bfa198b41\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.908035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory\") pod \"769125a8-8870-4ddf-86e3-cb1bfa198b41\" (UID: \"769125a8-8870-4ddf-86e3-cb1bfa198b41\") " Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.913204 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx" (OuterVolumeSpecName: "kube-api-access-k9hjx") pod "769125a8-8870-4ddf-86e3-cb1bfa198b41" (UID: "769125a8-8870-4ddf-86e3-cb1bfa198b41"). InnerVolumeSpecName "kube-api-access-k9hjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.937724 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory" (OuterVolumeSpecName: "inventory") pod "769125a8-8870-4ddf-86e3-cb1bfa198b41" (UID: "769125a8-8870-4ddf-86e3-cb1bfa198b41"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:22 crc kubenswrapper[4717]: I1007 14:23:22.938922 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "769125a8-8870-4ddf-86e3-cb1bfa198b41" (UID: "769125a8-8870-4ddf-86e3-cb1bfa198b41"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.012113 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.012358 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/769125a8-8870-4ddf-86e3-cb1bfa198b41-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.012370 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9hjx\" (UniqueName: \"kubernetes.io/projected/769125a8-8870-4ddf-86e3-cb1bfa198b41-kube-api-access-k9hjx\") on node \"crc\" DevicePath \"\"" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.371350 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" event={"ID":"769125a8-8870-4ddf-86e3-cb1bfa198b41","Type":"ContainerDied","Data":"6ae367bca32578bc524520cc26e2c17cf8818a725f9ba252f7eda35d655fe06e"} Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.371389 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ae367bca32578bc524520cc26e2c17cf8818a725f9ba252f7eda35d655fe06e" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.371406 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.455771 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5"] Oct 07 14:23:23 crc kubenswrapper[4717]: E1007 14:23:23.456303 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769125a8-8870-4ddf-86e3-cb1bfa198b41" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.456321 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="769125a8-8870-4ddf-86e3-cb1bfa198b41" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.456550 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="769125a8-8870-4ddf-86e3-cb1bfa198b41" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.457346 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.459266 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.459495 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.460115 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.465601 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.466221 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5"] Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.624160 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrxkw\" (UniqueName: \"kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.624228 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.624310 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.726475 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.726627 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.726729 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrxkw\" (UniqueName: \"kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.730451 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.730778 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.743564 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrxkw\" (UniqueName: \"kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:23 crc kubenswrapper[4717]: I1007 14:23:23.779046 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:23:24 crc kubenswrapper[4717]: I1007 14:23:24.273428 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5"] Oct 07 14:23:24 crc kubenswrapper[4717]: I1007 14:23:24.394426 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" event={"ID":"c4007561-6cfe-400e-81b9-d60b36d79171","Type":"ContainerStarted","Data":"239cc0280620b3f930f2075d08d9e188d0cc960a00788889c9bbd0f66f671f69"} Oct 07 14:23:25 crc kubenswrapper[4717]: I1007 14:23:25.405429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" event={"ID":"c4007561-6cfe-400e-81b9-d60b36d79171","Type":"ContainerStarted","Data":"fb68d820234c1346331c596d33b8cc18cf9ca3d698afebbe538a8a1014312de6"} Oct 07 14:23:25 crc kubenswrapper[4717]: I1007 14:23:25.419169 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" podStartSLOduration=1.932549037 podStartE2EDuration="2.419150855s" podCreationTimestamp="2025-10-07 14:23:23 +0000 UTC" firstStartedPulling="2025-10-07 14:23:24.274463694 +0000 UTC m=+1786.102389486" lastFinishedPulling="2025-10-07 14:23:24.761065502 +0000 UTC m=+1786.588991304" observedRunningTime="2025-10-07 14:23:25.417852909 +0000 UTC m=+1787.245778701" watchObservedRunningTime="2025-10-07 14:23:25.419150855 +0000 UTC m=+1787.247076647" Oct 07 14:23:34 crc kubenswrapper[4717]: I1007 14:23:34.868077 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:23:34 crc kubenswrapper[4717]: E1007 14:23:34.868912 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:23:35 crc kubenswrapper[4717]: I1007 14:23:35.036987 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-fdlxw"] Oct 07 14:23:35 crc kubenswrapper[4717]: I1007 14:23:35.047050 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-fdlxw"] Oct 07 14:23:36 crc kubenswrapper[4717]: I1007 14:23:36.893067 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1703fd90-e328-4f67-850f-38f8663dd2c2" path="/var/lib/kubelet/pods/1703fd90-e328-4f67-850f-38f8663dd2c2/volumes" Oct 07 14:23:42 crc kubenswrapper[4717]: I1007 14:23:42.969337 4717 scope.go:117] "RemoveContainer" containerID="83bd414aff4dec2a36f523bbb9ab67ac041ffb1a60090ae72f3bac9be7cbec45" Oct 07 14:23:43 crc kubenswrapper[4717]: I1007 14:23:43.010960 4717 scope.go:117] "RemoveContainer" containerID="2f705c4c414143c30e180bd19bb50124e635fb19a70b198027dcac8cea7a78f2" Oct 07 14:23:43 crc kubenswrapper[4717]: I1007 14:23:43.085315 4717 scope.go:117] "RemoveContainer" containerID="4e9d004ce614d6e73bd50f6a9aabd4d649b982cdde84bcad0d197112347c667a" Oct 07 14:23:43 crc kubenswrapper[4717]: I1007 14:23:43.125724 4717 scope.go:117] "RemoveContainer" containerID="729507490ead88b8332b0b90d09b340fc07490c4ed508bd1cf56e705bfc50819" Oct 07 14:23:43 crc kubenswrapper[4717]: I1007 14:23:43.162682 4717 scope.go:117] "RemoveContainer" containerID="b93e480bbb5018f3ccdd3bd336d0576e5c6b868c33875b4ba82588ae48cff50f" Oct 07 14:23:45 crc kubenswrapper[4717]: I1007 14:23:45.869528 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:23:45 crc kubenswrapper[4717]: E1007 14:23:45.870092 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.044833 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zlz22"] Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.057063 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jrb7v"] Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.065537 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bgz59"] Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.075019 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zlz22"] Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.091577 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jrb7v"] Oct 07 14:23:49 crc kubenswrapper[4717]: I1007 14:23:49.098803 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bgz59"] Oct 07 14:23:50 crc kubenswrapper[4717]: I1007 14:23:50.879982 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0061a317-01ad-4690-a87d-8e2e6f6f3344" path="/var/lib/kubelet/pods/0061a317-01ad-4690-a87d-8e2e6f6f3344/volumes" Oct 07 14:23:50 crc kubenswrapper[4717]: I1007 14:23:50.881224 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3576d729-52a6-41c4-b070-64f86f9bc55b" path="/var/lib/kubelet/pods/3576d729-52a6-41c4-b070-64f86f9bc55b/volumes" Oct 07 14:23:50 crc kubenswrapper[4717]: I1007 14:23:50.882167 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29c601f-2095-4f1f-baf8-8c79792118dd" path="/var/lib/kubelet/pods/f29c601f-2095-4f1f-baf8-8c79792118dd/volumes" Oct 07 14:24:00 crc kubenswrapper[4717]: I1007 14:24:00.868344 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:24:00 crc kubenswrapper[4717]: E1007 14:24:00.869070 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:24:05 crc kubenswrapper[4717]: I1007 14:24:05.043744 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6c8b-account-create-7rnnt"] Oct 07 14:24:05 crc kubenswrapper[4717]: I1007 14:24:05.051422 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6c8b-account-create-7rnnt"] Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.029980 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-9fe5-account-create-4vbdk"] Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.039431 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c898-account-create-ch4ht"] Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.047105 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c898-account-create-ch4ht"] Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.053766 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-9fe5-account-create-4vbdk"] Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.880239 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46139707-cb57-4ea6-8146-624e1bc2e42f" path="/var/lib/kubelet/pods/46139707-cb57-4ea6-8146-624e1bc2e42f/volumes" Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.880809 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940dabf8-ee08-4071-bf12-296bae9464b7" path="/var/lib/kubelet/pods/940dabf8-ee08-4071-bf12-296bae9464b7/volumes" Oct 07 14:24:06 crc kubenswrapper[4717]: I1007 14:24:06.881396 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7cad25-0861-422f-8ac4-5323c48f28fa" path="/var/lib/kubelet/pods/bf7cad25-0861-422f-8ac4-5323c48f28fa/volumes" Oct 07 14:24:13 crc kubenswrapper[4717]: I1007 14:24:13.868483 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:24:14 crc kubenswrapper[4717]: I1007 14:24:14.849146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" 
event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7"} Oct 07 14:24:33 crc kubenswrapper[4717]: I1007 14:24:33.041174 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j9cmq"] Oct 07 14:24:33 crc kubenswrapper[4717]: I1007 14:24:33.048599 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-j9cmq"] Oct 07 14:24:34 crc kubenswrapper[4717]: I1007 14:24:34.878863 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55d6263-cc3d-41fc-8024-06ca7612fece" path="/var/lib/kubelet/pods/c55d6263-cc3d-41fc-8024-06ca7612fece/volumes" Oct 07 14:24:38 crc kubenswrapper[4717]: I1007 14:24:38.042835 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4007561-6cfe-400e-81b9-d60b36d79171" containerID="fb68d820234c1346331c596d33b8cc18cf9ca3d698afebbe538a8a1014312de6" exitCode=0 Oct 07 14:24:38 crc kubenswrapper[4717]: I1007 14:24:38.042954 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" event={"ID":"c4007561-6cfe-400e-81b9-d60b36d79171","Type":"ContainerDied","Data":"fb68d820234c1346331c596d33b8cc18cf9ca3d698afebbe538a8a1014312de6"} Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.475345 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.594258 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory\") pod \"c4007561-6cfe-400e-81b9-d60b36d79171\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.594348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrxkw\" (UniqueName: \"kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw\") pod \"c4007561-6cfe-400e-81b9-d60b36d79171\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.594645 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key\") pod \"c4007561-6cfe-400e-81b9-d60b36d79171\" (UID: \"c4007561-6cfe-400e-81b9-d60b36d79171\") " Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.599545 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw" (OuterVolumeSpecName: "kube-api-access-nrxkw") pod "c4007561-6cfe-400e-81b9-d60b36d79171" (UID: "c4007561-6cfe-400e-81b9-d60b36d79171"). InnerVolumeSpecName "kube-api-access-nrxkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.621688 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory" (OuterVolumeSpecName: "inventory") pod "c4007561-6cfe-400e-81b9-d60b36d79171" (UID: "c4007561-6cfe-400e-81b9-d60b36d79171"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.622397 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4007561-6cfe-400e-81b9-d60b36d79171" (UID: "c4007561-6cfe-400e-81b9-d60b36d79171"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.697480 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrxkw\" (UniqueName: \"kubernetes.io/projected/c4007561-6cfe-400e-81b9-d60b36d79171-kube-api-access-nrxkw\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.697515 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:39 crc kubenswrapper[4717]: I1007 14:24:39.697527 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4007561-6cfe-400e-81b9-d60b36d79171-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.063033 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" event={"ID":"c4007561-6cfe-400e-81b9-d60b36d79171","Type":"ContainerDied","Data":"239cc0280620b3f930f2075d08d9e188d0cc960a00788889c9bbd0f66f671f69"} Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.063388 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="239cc0280620b3f930f2075d08d9e188d0cc960a00788889c9bbd0f66f671f69" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.063134 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.150438 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2"] Oct 07 14:24:40 crc kubenswrapper[4717]: E1007 14:24:40.150904 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4007561-6cfe-400e-81b9-d60b36d79171" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.150928 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4007561-6cfe-400e-81b9-d60b36d79171" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.151179 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4007561-6cfe-400e-81b9-d60b36d79171" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.151897 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.154066 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.154909 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.155068 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.155374 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.162788 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2"] Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.308560 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ptr\" (UniqueName: \"kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.308674 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.308780 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.409942 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.410073 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.410133 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48ptr\" (UniqueName: \"kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.414383 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.416662 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.427407 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48ptr\" (UniqueName: \"kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-zthb2\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.481891 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:40 crc kubenswrapper[4717]: I1007 14:24:40.971428 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2"] Oct 07 14:24:41 crc kubenswrapper[4717]: I1007 14:24:41.072463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" event={"ID":"7499d2ef-057a-4267-9725-bb62675d9eb8","Type":"ContainerStarted","Data":"944163214e48cd65e8b1749d32bbb10d17941b2cd281c6b4243fc3dea23e7025"} Oct 07 14:24:42 crc kubenswrapper[4717]: I1007 14:24:42.088552 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" event={"ID":"7499d2ef-057a-4267-9725-bb62675d9eb8","Type":"ContainerStarted","Data":"a81e4d0b09c36b3efc31f79de380e2d52e5ad6125c872bcf45ed499edb04c09a"} Oct 07 14:24:42 crc kubenswrapper[4717]: I1007 14:24:42.110384 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" podStartSLOduration=1.586460671 podStartE2EDuration="2.110363027s" podCreationTimestamp="2025-10-07 14:24:40 +0000 UTC" firstStartedPulling="2025-10-07 14:24:40.972772432 +0000 UTC m=+1862.800698224" lastFinishedPulling="2025-10-07 14:24:41.496674788 +0000 UTC m=+1863.324600580" observedRunningTime="2025-10-07 14:24:42.106710606 +0000 UTC m=+1863.934636388" watchObservedRunningTime="2025-10-07 14:24:42.110363027 +0000 UTC m=+1863.938288819" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.311529 4717 scope.go:117] "RemoveContainer" containerID="a8b4ce98facd374334e6447a5f20df51dc3aa1749c78a336636e048afd923735" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.350378 4717 scope.go:117] "RemoveContainer" 
containerID="92954b2eeace8644b17de546a6647c2c824fbff3f530d9da7e69859cce182bd0" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.381736 4717 scope.go:117] "RemoveContainer" containerID="67e18d287c511e4cb04b6b5aaf9a50b715fadf055154978c75eefd8a4b0affe5" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.441528 4717 scope.go:117] "RemoveContainer" containerID="754d72d0b9ce03a416452fec8eb9d32ff82f3b5cfdfff5ae909f39ac5e65b7a2" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.476314 4717 scope.go:117] "RemoveContainer" containerID="0e78793318db11adec9b8d99ee0cf901588375e520aa6f4d374137c06585184c" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.529494 4717 scope.go:117] "RemoveContainer" containerID="6ae81542ac151e06e3eb9cf5a33e8094b274215f7751d1bd9a51277ec1b63d01" Oct 07 14:24:43 crc kubenswrapper[4717]: I1007 14:24:43.583528 4717 scope.go:117] "RemoveContainer" containerID="504e95263a0df8a967d21c8e96b7da41e13d637e5d96669d2d2d78062818cc77" Oct 07 14:24:47 crc kubenswrapper[4717]: I1007 14:24:47.132678 4717 generic.go:334] "Generic (PLEG): container finished" podID="7499d2ef-057a-4267-9725-bb62675d9eb8" containerID="a81e4d0b09c36b3efc31f79de380e2d52e5ad6125c872bcf45ed499edb04c09a" exitCode=0 Oct 07 14:24:47 crc kubenswrapper[4717]: I1007 14:24:47.132761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" event={"ID":"7499d2ef-057a-4267-9725-bb62675d9eb8","Type":"ContainerDied","Data":"a81e4d0b09c36b3efc31f79de380e2d52e5ad6125c872bcf45ed499edb04c09a"} Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.567993 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.678916 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key\") pod \"7499d2ef-057a-4267-9725-bb62675d9eb8\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.679118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory\") pod \"7499d2ef-057a-4267-9725-bb62675d9eb8\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.679140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48ptr\" (UniqueName: \"kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr\") pod \"7499d2ef-057a-4267-9725-bb62675d9eb8\" (UID: \"7499d2ef-057a-4267-9725-bb62675d9eb8\") " Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.687182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr" (OuterVolumeSpecName: "kube-api-access-48ptr") pod "7499d2ef-057a-4267-9725-bb62675d9eb8" (UID: "7499d2ef-057a-4267-9725-bb62675d9eb8"). InnerVolumeSpecName "kube-api-access-48ptr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.716297 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7499d2ef-057a-4267-9725-bb62675d9eb8" (UID: "7499d2ef-057a-4267-9725-bb62675d9eb8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.717315 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory" (OuterVolumeSpecName: "inventory") pod "7499d2ef-057a-4267-9725-bb62675d9eb8" (UID: "7499d2ef-057a-4267-9725-bb62675d9eb8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.781657 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48ptr\" (UniqueName: \"kubernetes.io/projected/7499d2ef-057a-4267-9725-bb62675d9eb8-kube-api-access-48ptr\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.781693 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:48 crc kubenswrapper[4717]: I1007 14:24:48.781702 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7499d2ef-057a-4267-9725-bb62675d9eb8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.157543 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" event={"ID":"7499d2ef-057a-4267-9725-bb62675d9eb8","Type":"ContainerDied","Data":"944163214e48cd65e8b1749d32bbb10d17941b2cd281c6b4243fc3dea23e7025"} Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.157576 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-zthb2" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.157581 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944163214e48cd65e8b1749d32bbb10d17941b2cd281c6b4243fc3dea23e7025" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.214362 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57"] Oct 07 14:24:49 crc kubenswrapper[4717]: E1007 14:24:49.214783 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7499d2ef-057a-4267-9725-bb62675d9eb8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.214800 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="7499d2ef-057a-4267-9725-bb62675d9eb8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.215090 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="7499d2ef-057a-4267-9725-bb62675d9eb8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.215931 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.222155 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.222667 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.222956 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.223120 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.225413 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57"] Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.393597 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt2g\" (UniqueName: \"kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.394085 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.394163 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.496746 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt2g\" (UniqueName: \"kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.496817 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.496855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: 
\"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.504289 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.504765 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.518584 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt2g\" (UniqueName: \"kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xfj57\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:49 crc kubenswrapper[4717]: I1007 14:24:49.533759 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:24:50 crc kubenswrapper[4717]: I1007 14:24:50.048233 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57"] Oct 07 14:24:50 crc kubenswrapper[4717]: W1007 14:24:50.061178 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf34104_46e2_4650_bdd0_f3f8cfb6d590.slice/crio-fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276 WatchSource:0}: Error finding container fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276: Status 404 returned error can't find the container with id fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276 Oct 07 14:24:50 crc kubenswrapper[4717]: I1007 14:24:50.177389 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" event={"ID":"bbf34104-46e2-4650-bdd0-f3f8cfb6d590","Type":"ContainerStarted","Data":"fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276"} Oct 07 14:24:51 crc kubenswrapper[4717]: I1007 14:24:51.024054 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-s4mqg"] Oct 07 14:24:51 crc kubenswrapper[4717]: I1007 14:24:51.031035 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-s4mqg"] Oct 07 14:24:51 crc kubenswrapper[4717]: I1007 14:24:51.185533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" event={"ID":"bbf34104-46e2-4650-bdd0-f3f8cfb6d590","Type":"ContainerStarted","Data":"0ac3e25fb81157afe23c67f479064e26a06cd77f011d9b2481728aba5cdbc381"} Oct 07 14:24:51 crc kubenswrapper[4717]: I1007 14:24:51.212497 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" 
podStartSLOduration=1.71896628 podStartE2EDuration="2.21248026s" podCreationTimestamp="2025-10-07 14:24:49 +0000 UTC" firstStartedPulling="2025-10-07 14:24:50.064893521 +0000 UTC m=+1871.892819313" lastFinishedPulling="2025-10-07 14:24:50.558407501 +0000 UTC m=+1872.386333293" observedRunningTime="2025-10-07 14:24:51.206113655 +0000 UTC m=+1873.034039467" watchObservedRunningTime="2025-10-07 14:24:51.21248026 +0000 UTC m=+1873.040406052" Oct 07 14:24:52 crc kubenswrapper[4717]: I1007 14:24:52.880797 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4d74639-11be-451b-ad1f-56897155fc06" path="/var/lib/kubelet/pods/e4d74639-11be-451b-ad1f-56897155fc06/volumes" Oct 07 14:24:54 crc kubenswrapper[4717]: I1007 14:24:54.038982 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m6bpx"] Oct 07 14:24:54 crc kubenswrapper[4717]: I1007 14:24:54.049778 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-m6bpx"] Oct 07 14:24:54 crc kubenswrapper[4717]: I1007 14:24:54.896595 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bc633c-b89e-4eff-957b-9f2cd14038bb" path="/var/lib/kubelet/pods/48bc633c-b89e-4eff-957b-9f2cd14038bb/volumes" Oct 07 14:25:30 crc kubenswrapper[4717]: I1007 14:25:30.508134 4717 generic.go:334] "Generic (PLEG): container finished" podID="bbf34104-46e2-4650-bdd0-f3f8cfb6d590" containerID="0ac3e25fb81157afe23c67f479064e26a06cd77f011d9b2481728aba5cdbc381" exitCode=0 Oct 07 14:25:30 crc kubenswrapper[4717]: I1007 14:25:30.508228 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" event={"ID":"bbf34104-46e2-4650-bdd0-f3f8cfb6d590","Type":"ContainerDied","Data":"0ac3e25fb81157afe23c67f479064e26a06cd77f011d9b2481728aba5cdbc381"} Oct 07 14:25:31 crc kubenswrapper[4717]: I1007 14:25:31.928357 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.083212 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key\") pod \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.083578 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppt2g\" (UniqueName: \"kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g\") pod \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.083603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory\") pod \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\" (UID: \"bbf34104-46e2-4650-bdd0-f3f8cfb6d590\") " Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.089348 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g" (OuterVolumeSpecName: "kube-api-access-ppt2g") pod "bbf34104-46e2-4650-bdd0-f3f8cfb6d590" (UID: "bbf34104-46e2-4650-bdd0-f3f8cfb6d590"). InnerVolumeSpecName "kube-api-access-ppt2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.112786 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bbf34104-46e2-4650-bdd0-f3f8cfb6d590" (UID: "bbf34104-46e2-4650-bdd0-f3f8cfb6d590"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.119404 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory" (OuterVolumeSpecName: "inventory") pod "bbf34104-46e2-4650-bdd0-f3f8cfb6d590" (UID: "bbf34104-46e2-4650-bdd0-f3f8cfb6d590"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.185740 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.185774 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppt2g\" (UniqueName: \"kubernetes.io/projected/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-kube-api-access-ppt2g\") on node \"crc\" DevicePath \"\"" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.185788 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bbf34104-46e2-4650-bdd0-f3f8cfb6d590-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.534968 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" event={"ID":"bbf34104-46e2-4650-bdd0-f3f8cfb6d590","Type":"ContainerDied","Data":"fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276"} Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.535056 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fadffab2623dee85e72d39045e238d6ebc99d006ce86e3bebb1ebc7979909276" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.535140 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xfj57" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.620845 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv"] Oct 07 14:25:32 crc kubenswrapper[4717]: E1007 14:25:32.621498 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf34104-46e2-4650-bdd0-f3f8cfb6d590" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.621577 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf34104-46e2-4650-bdd0-f3f8cfb6d590" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.621851 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf34104-46e2-4650-bdd0-f3f8cfb6d590" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.622561 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.625554 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.625584 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.625742 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.626290 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.632621 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv"] Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.695791 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.696049 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqzn\" (UniqueName: \"kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.696140 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.797445 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.797590 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqzn\" (UniqueName: \"kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.797664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" 
(UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.801667 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.801768 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.818891 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqzn\" (UniqueName: \"kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dppcv\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:32 crc kubenswrapper[4717]: I1007 14:25:32.955202 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:25:33 crc kubenswrapper[4717]: I1007 14:25:33.453632 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv"] Oct 07 14:25:33 crc kubenswrapper[4717]: I1007 14:25:33.543662 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" event={"ID":"b7e5a28d-f380-447f-998f-5e65280d3651","Type":"ContainerStarted","Data":"7b09a5faa7529c6af305879affede0c15f37cc0f90eb7e5a8a7145f7d4243bc5"} Oct 07 14:25:34 crc kubenswrapper[4717]: I1007 14:25:34.553177 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" event={"ID":"b7e5a28d-f380-447f-998f-5e65280d3651","Type":"ContainerStarted","Data":"69979dd18bba4adc3e1df852d051d4b19f235439998b874406d577d9c88cc2ad"} Oct 07 14:25:34 crc kubenswrapper[4717]: I1007 14:25:34.571112 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" podStartSLOduration=2.109474258 podStartE2EDuration="2.571089514s" podCreationTimestamp="2025-10-07 14:25:32 +0000 UTC" firstStartedPulling="2025-10-07 14:25:33.461158494 +0000 UTC m=+1915.289084286" lastFinishedPulling="2025-10-07 14:25:33.92277375 +0000 UTC m=+1915.750699542" observedRunningTime="2025-10-07 14:25:34.566158048 +0000 UTC m=+1916.394083850" watchObservedRunningTime="2025-10-07 14:25:34.571089514 +0000 UTC m=+1916.399015316" Oct 07 14:25:36 crc kubenswrapper[4717]: I1007 14:25:36.038868 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rj7jp"] Oct 07 14:25:36 crc kubenswrapper[4717]: I1007 14:25:36.046032 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rj7jp"] Oct 07 14:25:36 crc kubenswrapper[4717]: I1007 14:25:36.878923 4717 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="663a1e17-3611-4da4-8d88-e3f5b3c7d0b6" path="/var/lib/kubelet/pods/663a1e17-3611-4da4-8d88-e3f5b3c7d0b6/volumes" Oct 07 14:25:43 crc kubenswrapper[4717]: I1007 14:25:43.708060 4717 scope.go:117] "RemoveContainer" containerID="69506bce28f7bce0d8a2f673a3affe898d6303fd2dc4ccf5d827f2559359c315" Oct 07 14:25:43 crc kubenswrapper[4717]: I1007 14:25:43.740545 4717 scope.go:117] "RemoveContainer" containerID="54c445c672165da09acda91840f544f06248e79459c9b182bb0ec480968101ed" Oct 07 14:25:43 crc kubenswrapper[4717]: I1007 14:25:43.788972 4717 scope.go:117] "RemoveContainer" containerID="246d2fc2425512ceccdf324e7c83ce6724aec8ab2ee2beea75e1291fcdadefb3" Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.819394 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.822207 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.834227 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.930837 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwlk\" (UniqueName: \"kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.931155 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:08 crc kubenswrapper[4717]: I1007 14:26:08.931282 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.033961 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwlk\" (UniqueName: \"kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.034079 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.034116 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities\") pod 
\"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.034703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.034700 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.052437 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwlk\" (UniqueName: \"kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk\") pod \"redhat-marketplace-pfsbb\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.152339 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.655658 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:09 crc kubenswrapper[4717]: I1007 14:26:09.840860 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerStarted","Data":"a82d42b54d1132ef5536be66dcbe31f2de13f207d4b2cc6a0013b8babf86cb34"} Oct 07 14:26:10 crc kubenswrapper[4717]: I1007 14:26:10.851517 4717 generic.go:334] "Generic (PLEG): container finished" podID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerID="2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db" exitCode=0 Oct 07 14:26:10 crc kubenswrapper[4717]: I1007 14:26:10.851582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerDied","Data":"2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db"} Oct 07 14:26:11 crc kubenswrapper[4717]: I1007 14:26:11.861873 4717 generic.go:334] "Generic (PLEG): container finished" podID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerID="9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df" exitCode=0 Oct 07 14:26:11 crc kubenswrapper[4717]: I1007 14:26:11.861977 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerDied","Data":"9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df"} Oct 07 14:26:12 crc kubenswrapper[4717]: I1007 14:26:12.888534 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerStarted","Data":"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e"} Oct 07 14:26:12 crc kubenswrapper[4717]: I1007 14:26:12.905940 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pfsbb" podStartSLOduration=3.501791199 podStartE2EDuration="4.905913725s" podCreationTimestamp="2025-10-07 14:26:08 +0000 UTC" firstStartedPulling="2025-10-07 14:26:10.853996269 +0000 UTC m=+1952.681922061" lastFinishedPulling="2025-10-07 14:26:12.258118795 +0000 UTC m=+1954.086044587" observedRunningTime="2025-10-07 14:26:12.903378236 +0000 UTC m=+1954.731304038" watchObservedRunningTime="2025-10-07 14:26:12.905913725 +0000 UTC m=+1954.733839527" Oct 07 14:26:19 crc kubenswrapper[4717]: I1007 14:26:19.153112 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:19 crc kubenswrapper[4717]: I1007 14:26:19.153631 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:19 crc kubenswrapper[4717]: I1007 14:26:19.194524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:19 crc kubenswrapper[4717]: I1007 14:26:19.997880 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:20 crc kubenswrapper[4717]: I1007 14:26:20.054568 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:21 crc kubenswrapper[4717]: I1007 14:26:21.957877 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pfsbb" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="registry-server" containerID="cri-o://cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e" gracePeriod=2 Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.371406 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.549558 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities\") pod \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.549678 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbwlk\" (UniqueName: \"kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk\") pod \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.549833 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content\") pod \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\" (UID: \"08c52a75-b9f9-4cbf-9df0-3594e4574bc2\") " Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.553443 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities" (OuterVolumeSpecName: "utilities") pod "08c52a75-b9f9-4cbf-9df0-3594e4574bc2" (UID: "08c52a75-b9f9-4cbf-9df0-3594e4574bc2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.566185 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08c52a75-b9f9-4cbf-9df0-3594e4574bc2" (UID: "08c52a75-b9f9-4cbf-9df0-3594e4574bc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.573319 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk" (OuterVolumeSpecName: "kube-api-access-xbwlk") pod "08c52a75-b9f9-4cbf-9df0-3594e4574bc2" (UID: "08c52a75-b9f9-4cbf-9df0-3594e4574bc2"). InnerVolumeSpecName "kube-api-access-xbwlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.652685 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.652958 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbwlk\" (UniqueName: \"kubernetes.io/projected/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-kube-api-access-xbwlk\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.652967 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08c52a75-b9f9-4cbf-9df0-3594e4574bc2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.968791 4717 generic.go:334] "Generic (PLEG): container finished" podID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerID="cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e" exitCode=0 Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.968832 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerDied","Data":"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e"} Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.968857 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pfsbb" event={"ID":"08c52a75-b9f9-4cbf-9df0-3594e4574bc2","Type":"ContainerDied","Data":"a82d42b54d1132ef5536be66dcbe31f2de13f207d4b2cc6a0013b8babf86cb34"} Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.968873 4717 scope.go:117] "RemoveContainer" containerID="cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.968869 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pfsbb" Oct 07 14:26:22 crc kubenswrapper[4717]: I1007 14:26:22.993248 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.005225 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pfsbb"] Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.006650 4717 scope.go:117] "RemoveContainer" containerID="9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.039663 4717 scope.go:117] "RemoveContainer" containerID="2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.071754 4717 scope.go:117] "RemoveContainer" containerID="cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e" Oct 07 14:26:23 crc kubenswrapper[4717]: E1007 14:26:23.072214 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e\": container with ID starting with cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e not found: ID does not exist" containerID="cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.072271 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e"} err="failed to get container status \"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e\": rpc error: code = NotFound desc = could not find container \"cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e\": container with ID starting with cb502e50f38d10fda3bbb127edd9076347d9b5a7bd4e072fc0887f19ca27545e not found: ID does not exist" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.072304 4717 scope.go:117] "RemoveContainer" containerID="9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df" Oct 07 14:26:23 crc kubenswrapper[4717]: E1007 14:26:23.072709 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df\": container with ID starting with 9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df not found: ID does not exist" containerID="9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.072749 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df"} err="failed to get container status \"9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df\": rpc error: code = NotFound desc = could not find container \"9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df\": container with ID starting with 9d5b4f9849f8cea9dbbce0430524f3a1d8df4ddee7dfe3101146c7fa0a2560df not found: ID does not exist" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.072778 4717 scope.go:117] "RemoveContainer" containerID="2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db" Oct 07 14:26:23 crc kubenswrapper[4717]: E1007 14:26:23.073050 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db\": container with ID starting with 2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db not found: ID does not exist" containerID="2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db" Oct 07 14:26:23 crc kubenswrapper[4717]: I1007 14:26:23.073153 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db"} err="failed to get container status \"2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db\": rpc error: code = NotFound desc = could not find container \"2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db\": container with ID starting with 2e7c2a6bd6d760bc5babcd005c955c48341e5097fc48d646bc58d306dbc927db not found: ID does not exist" Oct 07 14:26:24 crc kubenswrapper[4717]: I1007 14:26:24.878888 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" path="/var/lib/kubelet/pods/08c52a75-b9f9-4cbf-9df0-3594e4574bc2/volumes" Oct 07 14:26:31 crc kubenswrapper[4717]: I1007 14:26:31.032820 4717 generic.go:334] "Generic (PLEG): container finished" podID="b7e5a28d-f380-447f-998f-5e65280d3651" containerID="69979dd18bba4adc3e1df852d051d4b19f235439998b874406d577d9c88cc2ad" exitCode=2 Oct 07 14:26:31 crc kubenswrapper[4717]: I1007 14:26:31.033508 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" event={"ID":"b7e5a28d-f380-447f-998f-5e65280d3651","Type":"ContainerDied","Data":"69979dd18bba4adc3e1df852d051d4b19f235439998b874406d577d9c88cc2ad"} Oct 07 14:26:31 crc kubenswrapper[4717]: I1007 14:26:31.610312 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:26:31 crc kubenswrapper[4717]: I1007 14:26:31.610591 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.437324 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.480932 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory\") pod \"b7e5a28d-f380-447f-998f-5e65280d3651\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.481040 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqzn\" (UniqueName: \"kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn\") pod \"b7e5a28d-f380-447f-998f-5e65280d3651\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.481132 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key\") pod \"b7e5a28d-f380-447f-998f-5e65280d3651\" (UID: \"b7e5a28d-f380-447f-998f-5e65280d3651\") " Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.486746 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn" (OuterVolumeSpecName: "kube-api-access-9pqzn") pod "b7e5a28d-f380-447f-998f-5e65280d3651" (UID: "b7e5a28d-f380-447f-998f-5e65280d3651"). InnerVolumeSpecName "kube-api-access-9pqzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.512431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b7e5a28d-f380-447f-998f-5e65280d3651" (UID: "b7e5a28d-f380-447f-998f-5e65280d3651"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.518544 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory" (OuterVolumeSpecName: "inventory") pod "b7e5a28d-f380-447f-998f-5e65280d3651" (UID: "b7e5a28d-f380-447f-998f-5e65280d3651"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.583405 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.583442 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7e5a28d-f380-447f-998f-5e65280d3651-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:32 crc kubenswrapper[4717]: I1007 14:26:32.583457 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqzn\" (UniqueName: \"kubernetes.io/projected/b7e5a28d-f380-447f-998f-5e65280d3651-kube-api-access-9pqzn\") on node \"crc\" DevicePath \"\"" Oct 07 14:26:33 crc kubenswrapper[4717]: I1007 14:26:33.051136 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" event={"ID":"b7e5a28d-f380-447f-998f-5e65280d3651","Type":"ContainerDied","Data":"7b09a5faa7529c6af305879affede0c15f37cc0f90eb7e5a8a7145f7d4243bc5"} Oct 07 14:26:33 crc kubenswrapper[4717]: I1007 14:26:33.051174 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b09a5faa7529c6af305879affede0c15f37cc0f90eb7e5a8a7145f7d4243bc5" Oct 07 14:26:33 crc kubenswrapper[4717]: I1007 14:26:33.051203 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dppcv" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.031038 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6"] Oct 07 14:26:40 crc kubenswrapper[4717]: E1007 14:26:40.032242 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="extract-utilities" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032263 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="extract-utilities" Oct 07 14:26:40 crc kubenswrapper[4717]: E1007 14:26:40.032279 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="extract-content" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032287 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="extract-content" Oct 07 14:26:40 crc kubenswrapper[4717]: E1007 14:26:40.032316 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="registry-server" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032324 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="registry-server" Oct 07 14:26:40 crc kubenswrapper[4717]: E1007 14:26:40.032346 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7e5a28d-f380-447f-998f-5e65280d3651" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032356 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7e5a28d-f380-447f-998f-5e65280d3651" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032599 4717 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="08c52a75-b9f9-4cbf-9df0-3594e4574bc2" containerName="registry-server" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.032617 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7e5a28d-f380-447f-998f-5e65280d3651" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.033475 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.035635 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.035635 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.037486 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.037662 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.049451 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6"] Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.221340 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.221521 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.221638 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc88f\" (UniqueName: \"kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.323029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.323212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.323334 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc88f\" (UniqueName: \"kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.329181 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.329327 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.343878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc88f\" (UniqueName: \"kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.355706 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:26:40 crc kubenswrapper[4717]: I1007 14:26:40.863818 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6"] Oct 07 14:26:41 crc kubenswrapper[4717]: I1007 14:26:41.120049 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" event={"ID":"27f71967-9d43-4b11-a286-1544c15adc41","Type":"ContainerStarted","Data":"dd2157ec33772d06a1c4e57df0ca16316dfd4087aca39e08c376b4305bfda5bc"} Oct 07 14:26:42 crc kubenswrapper[4717]: I1007 14:26:42.131169 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" event={"ID":"27f71967-9d43-4b11-a286-1544c15adc41","Type":"ContainerStarted","Data":"a8dd62a88d680f666a7bdbfaff3c45ce34ae7823d22f32e627ad5cea1f89fd64"} Oct 07 14:26:42 crc kubenswrapper[4717]: I1007 14:26:42.149480 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" podStartSLOduration=1.466677615 podStartE2EDuration="2.149448997s" podCreationTimestamp="2025-10-07 14:26:40 +0000 UTC" firstStartedPulling="2025-10-07 14:26:40.863428 +0000 UTC m=+1982.691353792" lastFinishedPulling="2025-10-07 14:26:41.546199392 +0000 UTC m=+1983.374125174" observedRunningTime="2025-10-07 14:26:42.146532917 +0000 UTC m=+1983.974458709" watchObservedRunningTime="2025-10-07 14:26:42.149448997 +0000 UTC m=+1983.977374789" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.428990 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5dsf7"] Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.431624 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.443158 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dsf7"] Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.563880 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-catalog-content\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.564330 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-utilities\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.564364 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t226w\" (UniqueName: \"kubernetes.io/projected/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-kube-api-access-t226w\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.666571 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-catalog-content\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.666641 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-utilities\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.666661 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t226w\" (UniqueName: \"kubernetes.io/projected/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-kube-api-access-t226w\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.667633 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-utilities\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.667655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-catalog-content\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.684897 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t226w\" (UniqueName: \"kubernetes.io/projected/4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2-kube-api-access-t226w\") pod \"redhat-operators-5dsf7\" (UID: \"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2\") " pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:52 crc kubenswrapper[4717]: I1007 14:26:52.756799 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:26:53 crc kubenswrapper[4717]: I1007 14:26:53.188971 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dsf7"] Oct 07 14:26:53 crc kubenswrapper[4717]: I1007 14:26:53.229725 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dsf7" event={"ID":"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2","Type":"ContainerStarted","Data":"719d0e793b16ebb5631f75e0a0d3a072c476644c7668e2bac90fa8b9617d05fd"} Oct 07 14:26:54 crc kubenswrapper[4717]: I1007 14:26:54.239854 4717 generic.go:334] "Generic (PLEG): container finished" podID="4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2" containerID="7134b34d492e37bb58c2f39db6927051b0cd57063493f7bc453c776918e15288" exitCode=0 Oct 07 14:26:54 crc kubenswrapper[4717]: I1007 14:26:54.241111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dsf7" event={"ID":"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2","Type":"ContainerDied","Data":"7134b34d492e37bb58c2f39db6927051b0cd57063493f7bc453c776918e15288"} Oct 07 14:26:54 crc kubenswrapper[4717]: I1007 14:26:54.243993 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:27:01 crc kubenswrapper[4717]: I1007 14:27:01.609802 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:27:01 crc kubenswrapper[4717]: I1007 14:27:01.610438 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:27:03 crc kubenswrapper[4717]: I1007 14:27:03.320764 4717 generic.go:334] "Generic (PLEG): container finished" podID="4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2" containerID="9d957f0f2395519989f1d37f16e1a889a069865ccfd6637c6956c307817dfbdd" exitCode=0 Oct 07 14:27:03 crc kubenswrapper[4717]: I1007 14:27:03.321079 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dsf7" event={"ID":"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2","Type":"ContainerDied","Data":"9d957f0f2395519989f1d37f16e1a889a069865ccfd6637c6956c307817dfbdd"} Oct 07 14:27:06 crc kubenswrapper[4717]: I1007 14:27:06.349659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dsf7" event={"ID":"4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2","Type":"ContainerStarted","Data":"d4cec5fa81d4470eb80920898c7c03de878453854137aba5a6cdacda1d2ab329"} Oct 07 14:27:06 crc kubenswrapper[4717]: I1007 14:27:06.368504 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5dsf7" podStartSLOduration=3.331452342 
podStartE2EDuration="14.368481705s" podCreationTimestamp="2025-10-07 14:26:52 +0000 UTC" firstStartedPulling="2025-10-07 14:26:54.243722065 +0000 UTC m=+1996.071647857" lastFinishedPulling="2025-10-07 14:27:05.280751428 +0000 UTC m=+2007.108677220" observedRunningTime="2025-10-07 14:27:06.364026862 +0000 UTC m=+2008.191952654" watchObservedRunningTime="2025-10-07 14:27:06.368481705 +0000 UTC m=+2008.196407507" Oct 07 14:27:12 crc kubenswrapper[4717]: I1007 14:27:12.757154 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:27:12 crc kubenswrapper[4717]: I1007 14:27:12.757624 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:27:12 crc kubenswrapper[4717]: I1007 14:27:12.808816 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:27:13 crc kubenswrapper[4717]: I1007 14:27:13.471435 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5dsf7" Oct 07 14:27:13 crc kubenswrapper[4717]: I1007 14:27:13.571061 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dsf7"] Oct 07 14:27:13 crc kubenswrapper[4717]: I1007 14:27:13.611649 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 14:27:13 crc kubenswrapper[4717]: I1007 14:27:13.611887 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b9pl8" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="registry-server" containerID="cri-o://e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e" gracePeriod=2 Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.083301 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.193857 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities\") pod \"873fb065-5d2d-48cc-b5f2-2d65764ec045\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.194793 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content\") pod \"873fb065-5d2d-48cc-b5f2-2d65764ec045\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.194842 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbmw\" (UniqueName: \"kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw\") pod \"873fb065-5d2d-48cc-b5f2-2d65764ec045\" (UID: \"873fb065-5d2d-48cc-b5f2-2d65764ec045\") " Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.195924 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities" (OuterVolumeSpecName: "utilities") pod "873fb065-5d2d-48cc-b5f2-2d65764ec045" (UID: "873fb065-5d2d-48cc-b5f2-2d65764ec045"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.203449 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw" (OuterVolumeSpecName: "kube-api-access-qqbmw") pod "873fb065-5d2d-48cc-b5f2-2d65764ec045" (UID: "873fb065-5d2d-48cc-b5f2-2d65764ec045"). InnerVolumeSpecName "kube-api-access-qqbmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.297856 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqbmw\" (UniqueName: \"kubernetes.io/projected/873fb065-5d2d-48cc-b5f2-2d65764ec045-kube-api-access-qqbmw\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.297904 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.336101 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "873fb065-5d2d-48cc-b5f2-2d65764ec045" (UID: "873fb065-5d2d-48cc-b5f2-2d65764ec045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.399527 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/873fb065-5d2d-48cc-b5f2-2d65764ec045-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.429245 4717 generic.go:334] "Generic (PLEG): container finished" podID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerID="e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e" exitCode=0 Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.429301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerDied","Data":"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e"} Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.429345 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b9pl8" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.429376 4717 scope.go:117] "RemoveContainer" containerID="e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.429358 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b9pl8" event={"ID":"873fb065-5d2d-48cc-b5f2-2d65764ec045","Type":"ContainerDied","Data":"7aa66ab5a644fea132e6b5fdecb190dd843be8dff6a3cc5353fdf8cecb2f28dd"} Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.458029 4717 scope.go:117] "RemoveContainer" containerID="6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.467891 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.480119 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b9pl8"] Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.499123 4717 scope.go:117] "RemoveContainer" containerID="36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.544426 4717 scope.go:117] "RemoveContainer" containerID="e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e" Oct 07 14:27:14 crc kubenswrapper[4717]: E1007 14:27:14.544976 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e\": container with ID starting with e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e not found: ID does not exist" containerID="e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.545090 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e"} err="failed to get container status \"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e\": rpc error: code = NotFound desc = could not find container \"e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e\": container with ID starting with e23c057e46f15d606621b41c8c1ce5b94a323a885f95ca88b124636185e02d4e not found: ID does not exist" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.545127 4717 scope.go:117] "RemoveContainer" containerID="6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5" Oct 07 14:27:14 crc kubenswrapper[4717]: E1007 14:27:14.545453 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5\": container with ID starting with 6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5 not found: ID does not exist" containerID="6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.545493 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5"} err="failed to get container status \"6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5\": rpc error: code = NotFound desc = could not find container 
\"6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5\": container with ID starting with 6298714b5240f89674e745e7eaca175717a26a3fb8ff142315c0be3ed1c95eb5 not found: ID does not exist" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.545517 4717 scope.go:117] "RemoveContainer" containerID="36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68" Oct 07 14:27:14 crc kubenswrapper[4717]: E1007 14:27:14.545775 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68\": container with ID starting with 36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68 not found: ID does not exist" containerID="36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.545813 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68"} err="failed to get container status \"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68\": rpc error: code = NotFound desc = could not find container \"36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68\": container with ID starting with 36233de2a03cf0e39d377d985fe758085a6ad093df6f131fc765181c01e72f68 not found: ID does not exist" Oct 07 14:27:14 crc kubenswrapper[4717]: I1007 14:27:14.878485 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" path="/var/lib/kubelet/pods/873fb065-5d2d-48cc-b5f2-2d65764ec045/volumes" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.982784 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:26 crc kubenswrapper[4717]: E1007 14:27:26.984034 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="extract-utilities" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.984055 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="extract-utilities" Oct 07 14:27:26 crc kubenswrapper[4717]: E1007 14:27:26.984099 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="registry-server" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.984107 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="registry-server" Oct 07 14:27:26 crc kubenswrapper[4717]: E1007 14:27:26.984127 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="extract-content" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.984137 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="extract-content" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.984393 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="873fb065-5d2d-48cc-b5f2-2d65764ec045" containerName="registry-server" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.986888 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:26 crc kubenswrapper[4717]: I1007 14:27:26.998781 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.053378 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbb9\" (UniqueName: \"kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.053528 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.053907 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.156339 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.156443 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbb9\" (UniqueName: \"kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.156502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.157112 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.157198 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.178646 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tjbb9\" (UniqueName: \"kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9\") pod \"certified-operators-65s9l\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.307336 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.556224 4717 generic.go:334] "Generic (PLEG): container finished" podID="27f71967-9d43-4b11-a286-1544c15adc41" containerID="a8dd62a88d680f666a7bdbfaff3c45ce34ae7823d22f32e627ad5cea1f89fd64" exitCode=0 Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.556308 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" event={"ID":"27f71967-9d43-4b11-a286-1544c15adc41","Type":"ContainerDied","Data":"a8dd62a88d680f666a7bdbfaff3c45ce34ae7823d22f32e627ad5cea1f89fd64"} Oct 07 14:27:27 crc kubenswrapper[4717]: I1007 14:27:27.797297 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:27 crc kubenswrapper[4717]: W1007 14:27:27.804050 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3568721_9ebe_48c5_af84_12163c594c8f.slice/crio-d85e6d18c62a435819a28880ce80eac2152d194951e81c2b609e553b8819d398 WatchSource:0}: Error finding container d85e6d18c62a435819a28880ce80eac2152d194951e81c2b609e553b8819d398: Status 404 returned error can't find the container with id d85e6d18c62a435819a28880ce80eac2152d194951e81c2b609e553b8819d398 Oct 07 14:27:28 crc kubenswrapper[4717]: I1007 14:27:28.567598 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3568721-9ebe-48c5-af84-12163c594c8f" containerID="73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60" exitCode=0 Oct 07 14:27:28 crc kubenswrapper[4717]: I1007 14:27:28.567718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerDied","Data":"73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60"} Oct 07 14:27:28 crc kubenswrapper[4717]: I1007 14:27:28.567991 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerStarted","Data":"d85e6d18c62a435819a28880ce80eac2152d194951e81c2b609e553b8819d398"} Oct 07 14:27:28 crc kubenswrapper[4717]: I1007 14:27:28.961699 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.096782 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key\") pod \"27f71967-9d43-4b11-a286-1544c15adc41\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.096842 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc88f\" (UniqueName: \"kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f\") pod \"27f71967-9d43-4b11-a286-1544c15adc41\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.096899 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory\") pod \"27f71967-9d43-4b11-a286-1544c15adc41\" (UID: \"27f71967-9d43-4b11-a286-1544c15adc41\") " Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.105312 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f" (OuterVolumeSpecName: "kube-api-access-zc88f") pod "27f71967-9d43-4b11-a286-1544c15adc41" (UID: "27f71967-9d43-4b11-a286-1544c15adc41"). InnerVolumeSpecName "kube-api-access-zc88f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.150587 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory" (OuterVolumeSpecName: "inventory") pod "27f71967-9d43-4b11-a286-1544c15adc41" (UID: "27f71967-9d43-4b11-a286-1544c15adc41"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.156789 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "27f71967-9d43-4b11-a286-1544c15adc41" (UID: "27f71967-9d43-4b11-a286-1544c15adc41"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.199416 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.199453 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc88f\" (UniqueName: \"kubernetes.io/projected/27f71967-9d43-4b11-a286-1544c15adc41-kube-api-access-zc88f\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.199468 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27f71967-9d43-4b11-a286-1544c15adc41-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.577390 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" event={"ID":"27f71967-9d43-4b11-a286-1544c15adc41","Type":"ContainerDied","Data":"dd2157ec33772d06a1c4e57df0ca16316dfd4087aca39e08c376b4305bfda5bc"} Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.577656 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd2157ec33772d06a1c4e57df0ca16316dfd4087aca39e08c376b4305bfda5bc" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.577498 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.662867 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d9g97"] Oct 07 14:27:29 crc kubenswrapper[4717]: E1007 14:27:29.663364 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f71967-9d43-4b11-a286-1544c15adc41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.663390 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f71967-9d43-4b11-a286-1544c15adc41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.663603 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f71967-9d43-4b11-a286-1544c15adc41" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.664384 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.670528 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.670529 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.670527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.670731 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.678839 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d9g97"] Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.708471 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.708522 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5l49\" (UniqueName: \"kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.708607 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.809916 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.810101 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.810140 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5l49\" (UniqueName: \"kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc 
kubenswrapper[4717]: I1007 14:27:29.815120 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.818558 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.826878 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5l49\" (UniqueName: \"kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49\") pod \"ssh-known-hosts-edpm-deployment-d9g97\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:29 crc kubenswrapper[4717]: I1007 14:27:29.990179 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:30 crc kubenswrapper[4717]: I1007 14:27:30.511403 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d9g97"] Oct 07 14:27:30 crc kubenswrapper[4717]: I1007 14:27:30.586584 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" event={"ID":"17107a9c-f715-4bd1-88ac-d79c769fd4e4","Type":"ContainerStarted","Data":"b06272c91c8843ca07b8ac3a96c569f97134414dbdc45cda89366cb6efb02627"} Oct 07 14:27:30 crc kubenswrapper[4717]: I1007 14:27:30.589657 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3568721-9ebe-48c5-af84-12163c594c8f" containerID="c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329" exitCode=0 Oct 07 14:27:30 crc kubenswrapper[4717]: I1007 14:27:30.589685 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerDied","Data":"c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329"} Oct 07 14:27:31 crc kubenswrapper[4717]: I1007 14:27:31.610223 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:27:31 crc kubenswrapper[4717]: I1007 14:27:31.610509 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:27:31 crc kubenswrapper[4717]: I1007 14:27:31.610563 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:27:31 crc kubenswrapper[4717]: I1007 14:27:31.611318 4717 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:27:31 crc kubenswrapper[4717]: I1007 14:27:31.611376 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7" gracePeriod=600 Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.630192 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerStarted","Data":"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2"} Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.635724 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7" exitCode=0 Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.635822 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7"} Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.635854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d"} Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.635876 4717 scope.go:117] "RemoveContainer" containerID="dfd659d9b41d8f2f096d86d50f91eb554e958d185ae45273781fd94eeefa8b86" Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.639964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" event={"ID":"17107a9c-f715-4bd1-88ac-d79c769fd4e4","Type":"ContainerStarted","Data":"238a9ef60a206836f38edb4357bccb1145934255d1bdff9763533bd5467a41ee"} Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.681420 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-65s9l" podStartSLOduration=3.782082058 podStartE2EDuration="6.681393141s" podCreationTimestamp="2025-10-07 14:27:26 +0000 UTC" firstStartedPulling="2025-10-07 14:27:28.56928371 +0000 UTC m=+2030.397209502" lastFinishedPulling="2025-10-07 14:27:31.468594793 +0000 UTC m=+2033.296520585" observedRunningTime="2025-10-07 14:27:32.653337719 +0000 UTC m=+2034.481263521" watchObservedRunningTime="2025-10-07 14:27:32.681393141 +0000 UTC m=+2034.509318953" Oct 07 14:27:32 crc kubenswrapper[4717]: I1007 14:27:32.689788 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" podStartSLOduration=2.859544922 podStartE2EDuration="3.689766041s" podCreationTimestamp="2025-10-07 14:27:29 +0000 UTC" firstStartedPulling="2025-10-07 14:27:30.52110625 +0000 UTC m=+2032.349032052" lastFinishedPulling="2025-10-07 14:27:31.351327379 +0000 UTC 
m=+2033.179253171" observedRunningTime="2025-10-07 14:27:32.685921535 +0000 UTC m=+2034.513847327" watchObservedRunningTime="2025-10-07 14:27:32.689766041 +0000 UTC m=+2034.517691833" Oct 07 14:27:37 crc kubenswrapper[4717]: I1007 14:27:37.311329 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:37 crc kubenswrapper[4717]: I1007 14:27:37.312836 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:37 crc kubenswrapper[4717]: I1007 14:27:37.367086 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:37 crc kubenswrapper[4717]: I1007 14:27:37.727627 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:37 crc kubenswrapper[4717]: I1007 14:27:37.774795 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:39 crc kubenswrapper[4717]: I1007 14:27:39.701040 4717 generic.go:334] "Generic (PLEG): container finished" podID="17107a9c-f715-4bd1-88ac-d79c769fd4e4" containerID="238a9ef60a206836f38edb4357bccb1145934255d1bdff9763533bd5467a41ee" exitCode=0 Oct 07 14:27:39 crc kubenswrapper[4717]: I1007 14:27:39.701119 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" event={"ID":"17107a9c-f715-4bd1-88ac-d79c769fd4e4","Type":"ContainerDied","Data":"238a9ef60a206836f38edb4357bccb1145934255d1bdff9763533bd5467a41ee"} Oct 07 14:27:39 crc kubenswrapper[4717]: I1007 14:27:39.701747 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-65s9l" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="registry-server" containerID="cri-o://16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2" gracePeriod=2 Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.634076 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.713398 4717 generic.go:334] "Generic (PLEG): container finished" podID="f3568721-9ebe-48c5-af84-12163c594c8f" containerID="16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2" exitCode=0 Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.713479 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65s9l" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.713473 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerDied","Data":"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2"} Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.713561 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65s9l" event={"ID":"f3568721-9ebe-48c5-af84-12163c594c8f","Type":"ContainerDied","Data":"d85e6d18c62a435819a28880ce80eac2152d194951e81c2b609e553b8819d398"} Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.713589 4717 scope.go:117] "RemoveContainer" containerID="16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.736810 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjbb9\" (UniqueName: \"kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9\") pod \"f3568721-9ebe-48c5-af84-12163c594c8f\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.737044 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content\") pod \"f3568721-9ebe-48c5-af84-12163c594c8f\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.737121 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities\") pod \"f3568721-9ebe-48c5-af84-12163c594c8f\" (UID: \"f3568721-9ebe-48c5-af84-12163c594c8f\") " Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.738242 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities" (OuterVolumeSpecName: "utilities") pod "f3568721-9ebe-48c5-af84-12163c594c8f" (UID: "f3568721-9ebe-48c5-af84-12163c594c8f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.752360 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9" (OuterVolumeSpecName: "kube-api-access-tjbb9") pod "f3568721-9ebe-48c5-af84-12163c594c8f" (UID: "f3568721-9ebe-48c5-af84-12163c594c8f"). InnerVolumeSpecName "kube-api-access-tjbb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.752775 4717 scope.go:117] "RemoveContainer" containerID="c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.821034 4717 scope.go:117] "RemoveContainer" containerID="73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.840710 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.840745 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjbb9\" (UniqueName: \"kubernetes.io/projected/f3568721-9ebe-48c5-af84-12163c594c8f-kube-api-access-tjbb9\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.864365 4717 scope.go:117] "RemoveContainer" containerID="16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2" Oct 07 14:27:40 crc kubenswrapper[4717]: E1007 14:27:40.864855 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2\": container with ID starting with 16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2 not found: ID does not exist" containerID="16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.864894 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2"} err="failed to get container status \"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2\": rpc error: code = NotFound desc = could not find container \"16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2\": container with ID starting with 16cfff88ce3e9a00fdb0794327cfa5792e66ef581661c4e8a41de4a76f44f9b2 not found: ID does not exist" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.864920 4717 scope.go:117] "RemoveContainer" containerID="c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329" Oct 07 14:27:40 crc kubenswrapper[4717]: E1007 14:27:40.865378 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329\": container with ID starting with c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329 not found: ID does not exist" containerID="c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.865408 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329"} err="failed to get container status \"c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329\": rpc error: code = NotFound desc = could not find container \"c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329\": container with ID starting with c80845087eb5fb7be3e3701c0e0c0055adcdb78a40f91b300eaf39b465c80329 not found: ID does not exist" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.865429 4717 scope.go:117] "RemoveContainer" 
containerID="73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60" Oct 07 14:27:40 crc kubenswrapper[4717]: E1007 14:27:40.865726 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60\": container with ID starting with 73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60 not found: ID does not exist" containerID="73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60" Oct 07 14:27:40 crc kubenswrapper[4717]: I1007 14:27:40.865761 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60"} err="failed to get container status \"73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60\": rpc error: code = NotFound desc = could not find container \"73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60\": container with ID starting with 73a605895f7c80f91e095fc7e02c3457fab7352fda5d4bd8e5fc613483567a60 not found: ID does not exist" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.220642 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.350403 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam\") pod \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.350458 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0\") pod \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.350477 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5l49\" (UniqueName: \"kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49\") pod \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\" (UID: \"17107a9c-f715-4bd1-88ac-d79c769fd4e4\") " Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.357946 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49" (OuterVolumeSpecName: "kube-api-access-t5l49") pod "17107a9c-f715-4bd1-88ac-d79c769fd4e4" (UID: "17107a9c-f715-4bd1-88ac-d79c769fd4e4"). InnerVolumeSpecName "kube-api-access-t5l49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.364717 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3568721-9ebe-48c5-af84-12163c594c8f" (UID: "f3568721-9ebe-48c5-af84-12163c594c8f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.379068 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "17107a9c-f715-4bd1-88ac-d79c769fd4e4" (UID: "17107a9c-f715-4bd1-88ac-d79c769fd4e4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.379197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "17107a9c-f715-4bd1-88ac-d79c769fd4e4" (UID: "17107a9c-f715-4bd1-88ac-d79c769fd4e4"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.453502 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3568721-9ebe-48c5-af84-12163c594c8f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.453777 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.453791 4717 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/17107a9c-f715-4bd1-88ac-d79c769fd4e4-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.453799 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5l49\" (UniqueName: \"kubernetes.io/projected/17107a9c-f715-4bd1-88ac-d79c769fd4e4-kube-api-access-t5l49\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.658617 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.667328 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-65s9l"] Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.724516 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" event={"ID":"17107a9c-f715-4bd1-88ac-d79c769fd4e4","Type":"ContainerDied","Data":"b06272c91c8843ca07b8ac3a96c569f97134414dbdc45cda89366cb6efb02627"} Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.724571 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06272c91c8843ca07b8ac3a96c569f97134414dbdc45cda89366cb6efb02627" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.724570 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d9g97" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.785339 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk"] Oct 07 14:27:41 crc kubenswrapper[4717]: E1007 14:27:41.785782 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="extract-utilities" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.785806 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="extract-utilities" Oct 07 14:27:41 crc kubenswrapper[4717]: E1007 14:27:41.785817 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17107a9c-f715-4bd1-88ac-d79c769fd4e4" containerName="ssh-known-hosts-edpm-deployment" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.785824 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="17107a9c-f715-4bd1-88ac-d79c769fd4e4" containerName="ssh-known-hosts-edpm-deployment" Oct 07 14:27:41 crc kubenswrapper[4717]: E1007 14:27:41.785852 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="extract-content" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.785858 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="extract-content" Oct 07 14:27:41 crc kubenswrapper[4717]: E1007 14:27:41.785866 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="registry-server" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.785873 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="registry-server" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.786070 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="17107a9c-f715-4bd1-88ac-d79c769fd4e4" containerName="ssh-known-hosts-edpm-deployment" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.786092 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" containerName="registry-server" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.794373 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk"] Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.794477 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.797429 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.797538 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.797724 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.797945 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.865546 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.865613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:41 crc kubenswrapper[4717]: I1007 14:27:41.866052 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npw7\" (UniqueName: \"kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.020219 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npw7\" (UniqueName: \"kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.020398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.020449 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.026615 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.027535 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.040513 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npw7\" (UniqueName: \"kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-9xqjk\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.110342 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.646480 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk"] Oct 07 14:27:42 crc kubenswrapper[4717]: W1007 14:27:42.646518 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d5f4c4_7e2a_46b8_8331_3372d6a7e825.slice/crio-f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9 WatchSource:0}: Error finding container f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9: Status 404 returned error can't find the container with id f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9 Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.734841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" event={"ID":"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825","Type":"ContainerStarted","Data":"f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9"} Oct 07 14:27:42 crc kubenswrapper[4717]: I1007 14:27:42.879856 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3568721-9ebe-48c5-af84-12163c594c8f" path="/var/lib/kubelet/pods/f3568721-9ebe-48c5-af84-12163c594c8f/volumes" Oct 07 14:27:43 crc kubenswrapper[4717]: I1007 14:27:43.744394 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" event={"ID":"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825","Type":"ContainerStarted","Data":"566f57daa1c6a05644197db1cd724234becd48858dbcf85955106ecbd5c0cf4d"} Oct 07 14:27:43 crc kubenswrapper[4717]: I1007 14:27:43.763509 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" podStartSLOduration=2.215953217 podStartE2EDuration="2.763488853s" podCreationTimestamp="2025-10-07 14:27:41 +0000 UTC" firstStartedPulling="2025-10-07 14:27:42.649933104 +0000 UTC m=+2044.477858896" lastFinishedPulling="2025-10-07 14:27:43.19746874 +0000 UTC m=+2045.025394532" observedRunningTime="2025-10-07 14:27:43.761285873 +0000 UTC m=+2045.589211675" 
watchObservedRunningTime="2025-10-07 14:27:43.763488853 +0000 UTC m=+2045.591414645" Oct 07 14:27:51 crc kubenswrapper[4717]: I1007 14:27:51.808980 4717 generic.go:334] "Generic (PLEG): container finished" podID="c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" containerID="566f57daa1c6a05644197db1cd724234becd48858dbcf85955106ecbd5c0cf4d" exitCode=0 Oct 07 14:27:51 crc kubenswrapper[4717]: I1007 14:27:51.809080 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" event={"ID":"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825","Type":"ContainerDied","Data":"566f57daa1c6a05644197db1cd724234becd48858dbcf85955106ecbd5c0cf4d"} Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.256942 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.363153 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory\") pod \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.363239 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9npw7\" (UniqueName: \"kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7\") pod \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.363360 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key\") pod \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\" (UID: \"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825\") " Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.375978 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7" (OuterVolumeSpecName: "kube-api-access-9npw7") pod "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" (UID: "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825"). InnerVolumeSpecName "kube-api-access-9npw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.394404 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" (UID: "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.396289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory" (OuterVolumeSpecName: "inventory") pod "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" (UID: "c4d5f4c4-7e2a-46b8-8331-3372d6a7e825"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.465747 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.465791 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9npw7\" (UniqueName: \"kubernetes.io/projected/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-kube-api-access-9npw7\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.465814 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4d5f4c4-7e2a-46b8-8331-3372d6a7e825-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.826557 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" event={"ID":"c4d5f4c4-7e2a-46b8-8331-3372d6a7e825","Type":"ContainerDied","Data":"f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9"} Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.826873 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f28d18e1bae1a9adbff0dde78378463829a8e07c3c32d1fc2841bc5eeb5a32e9" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.826613 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-9xqjk" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.894912 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5"] Oct 07 14:27:53 crc kubenswrapper[4717]: E1007 14:27:53.895369 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.895391 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.895599 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d5f4c4-7e2a-46b8-8331-3372d6a7e825" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.896378 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.899524 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.899624 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.899665 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.899692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:27:53 crc kubenswrapper[4717]: I1007 14:27:53.909159 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5"] Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.075625 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.075682 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82qmj\" (UniqueName: \"kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.075831 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.178229 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.178682 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.178838 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82qmj\" (UniqueName: \"kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: 
\"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.182672 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.194864 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.202708 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82qmj\" (UniqueName: \"kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.216648 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.725179 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5"] Oct 07 14:27:54 crc kubenswrapper[4717]: W1007 14:27:54.730607 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf9bf972_fbac_4b24_bf35_2cf668fca79d.slice/crio-c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610 WatchSource:0}: Error finding container c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610: Status 404 returned error can't find the container with id c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610 Oct 07 14:27:54 crc kubenswrapper[4717]: I1007 14:27:54.835372 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" event={"ID":"bf9bf972-fbac-4b24-bf35-2cf668fca79d","Type":"ContainerStarted","Data":"c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610"} Oct 07 14:27:55 crc kubenswrapper[4717]: I1007 14:27:55.844306 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" event={"ID":"bf9bf972-fbac-4b24-bf35-2cf668fca79d","Type":"ContainerStarted","Data":"637303d1ce9fe67e61487def4796ebd8db8ac927528c3ad31353d6dc06396c8e"} Oct 07 14:27:55 crc kubenswrapper[4717]: I1007 14:27:55.862980 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" podStartSLOduration=2.383882588 podStartE2EDuration="2.862961561s" podCreationTimestamp="2025-10-07 14:27:53 +0000 UTC" firstStartedPulling="2025-10-07 14:27:54.732853717 +0000 UTC m=+2056.560779509" lastFinishedPulling="2025-10-07 14:27:55.21193269 +0000 UTC m=+2057.039858482" observedRunningTime="2025-10-07 14:27:55.860699079 +0000 UTC m=+2057.688624871" 
watchObservedRunningTime="2025-10-07 14:27:55.862961561 +0000 UTC m=+2057.690887353" Oct 07 14:28:04 crc kubenswrapper[4717]: I1007 14:28:04.921522 4717 generic.go:334] "Generic (PLEG): container finished" podID="bf9bf972-fbac-4b24-bf35-2cf668fca79d" containerID="637303d1ce9fe67e61487def4796ebd8db8ac927528c3ad31353d6dc06396c8e" exitCode=0 Oct 07 14:28:04 crc kubenswrapper[4717]: I1007 14:28:04.921622 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" event={"ID":"bf9bf972-fbac-4b24-bf35-2cf668fca79d","Type":"ContainerDied","Data":"637303d1ce9fe67e61487def4796ebd8db8ac927528c3ad31353d6dc06396c8e"} Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.318847 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.465171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key\") pod \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.465326 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory\") pod \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.465375 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82qmj\" (UniqueName: \"kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj\") pod \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\" (UID: \"bf9bf972-fbac-4b24-bf35-2cf668fca79d\") " Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.471236 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj" (OuterVolumeSpecName: "kube-api-access-82qmj") pod "bf9bf972-fbac-4b24-bf35-2cf668fca79d" (UID: "bf9bf972-fbac-4b24-bf35-2cf668fca79d"). InnerVolumeSpecName "kube-api-access-82qmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.495130 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory" (OuterVolumeSpecName: "inventory") pod "bf9bf972-fbac-4b24-bf35-2cf668fca79d" (UID: "bf9bf972-fbac-4b24-bf35-2cf668fca79d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.498339 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bf9bf972-fbac-4b24-bf35-2cf668fca79d" (UID: "bf9bf972-fbac-4b24-bf35-2cf668fca79d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.567393 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.567428 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf9bf972-fbac-4b24-bf35-2cf668fca79d-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.567438 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82qmj\" (UniqueName: \"kubernetes.io/projected/bf9bf972-fbac-4b24-bf35-2cf668fca79d-kube-api-access-82qmj\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.939273 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" event={"ID":"bf9bf972-fbac-4b24-bf35-2cf668fca79d","Type":"ContainerDied","Data":"c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610"} Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.939567 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4fe32e169290b599d5db3f3c118aeaa0bdb79799296d7300f0c45377f3c1610" Oct 07 14:28:06 crc kubenswrapper[4717]: I1007 14:28:06.939317 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.019465 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc"] Oct 07 14:28:07 crc kubenswrapper[4717]: E1007 14:28:07.021165 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9bf972-fbac-4b24-bf35-2cf668fca79d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.021187 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9bf972-fbac-4b24-bf35-2cf668fca79d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.021435 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9bf972-fbac-4b24-bf35-2cf668fca79d" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.022215 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027037 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027083 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027345 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027346 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027486 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027527 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027557 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.027353 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.042274 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc"] Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.178965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179055 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179083 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179103 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179127 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179185 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179224 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179244 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179276 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pxjv\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179335 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: 
\"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179384 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.179413 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.281715 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.281815 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.281851 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.281898 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pxjv\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.282398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.282426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.282885 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.282951 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283069 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283117 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283140 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283162 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") 
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283192 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.283215 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.287899 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.287957 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.288810 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.289506 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.289661 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.289795 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.291361 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.291600 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.291858 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.292319 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.292660 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.299254 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.301556 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pxjv\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc 
kubenswrapper[4717]: I1007 14:28:07.301736 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-kljwc\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.344879 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.870763 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc"] Oct 07 14:28:07 crc kubenswrapper[4717]: I1007 14:28:07.947655 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" event={"ID":"fda137df-3fa6-470a-b41a-db9f55a550ab","Type":"ContainerStarted","Data":"e33d6752bd81d6cf3465a66ea579aae91f145618fe92e7bdcfb0598501e59224"} Oct 07 14:28:09 crc kubenswrapper[4717]: I1007 14:28:09.964645 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" event={"ID":"fda137df-3fa6-470a-b41a-db9f55a550ab","Type":"ContainerStarted","Data":"e2f3df2a8a385fcfa499c44f9842e34166a4df085611d2560df2422b7ed9b0a6"} Oct 07 14:28:09 crc kubenswrapper[4717]: I1007 14:28:09.999761 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" podStartSLOduration=3.09145559 podStartE2EDuration="3.999736275s" podCreationTimestamp="2025-10-07 14:28:06 +0000 UTC" firstStartedPulling="2025-10-07 14:28:07.882659921 +0000 UTC m=+2069.710585713" lastFinishedPulling="2025-10-07 14:28:08.790940606 +0000 UTC m=+2070.618866398" observedRunningTime="2025-10-07 14:28:09.986245684 +0000 UTC m=+2071.814171496" watchObservedRunningTime="2025-10-07 14:28:09.999736275 +0000 UTC m=+2071.827662067" Oct 07 14:28:49 crc kubenswrapper[4717]: I1007 14:28:49.357077 4717 generic.go:334] "Generic (PLEG): container finished" podID="fda137df-3fa6-470a-b41a-db9f55a550ab" containerID="e2f3df2a8a385fcfa499c44f9842e34166a4df085611d2560df2422b7ed9b0a6" exitCode=0 Oct 07 14:28:49 crc kubenswrapper[4717]: I1007 14:28:49.357163 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" event={"ID":"fda137df-3fa6-470a-b41a-db9f55a550ab","Type":"ContainerDied","Data":"e2f3df2a8a385fcfa499c44f9842e34166a4df085611d2560df2422b7ed9b0a6"} Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.803560 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.912503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.912594 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pxjv\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.912632 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.912830 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913084 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913144 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913176 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913242 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913304 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913364 4717 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913400 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913421 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913444 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.913467 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"fda137df-3fa6-470a-b41a-db9f55a550ab\" (UID: \"fda137df-3fa6-470a-b41a-db9f55a550ab\") " Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.918834 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.918840 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.919225 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.920092 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.920817 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.922091 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv" (OuterVolumeSpecName: "kube-api-access-8pxjv") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "kube-api-access-8pxjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.922274 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.923028 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.923977 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.924285 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.925273 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.942202 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.951881 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:50 crc kubenswrapper[4717]: I1007 14:28:50.952929 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory" (OuterVolumeSpecName: "inventory") pod "fda137df-3fa6-470a-b41a-db9f55a550ab" (UID: "fda137df-3fa6-470a-b41a-db9f55a550ab"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.017682 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018226 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018251 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018265 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018304 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018322 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018333 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018344 4717 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018358 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018369 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018381 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018395 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018409 4717 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda137df-3fa6-470a-b41a-db9f55a550ab-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.018420 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pxjv\" (UniqueName: \"kubernetes.io/projected/fda137df-3fa6-470a-b41a-db9f55a550ab-kube-api-access-8pxjv\") on node \"crc\" DevicePath \"\"" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.376148 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" event={"ID":"fda137df-3fa6-470a-b41a-db9f55a550ab","Type":"ContainerDied","Data":"e33d6752bd81d6cf3465a66ea579aae91f145618fe92e7bdcfb0598501e59224"} Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.376396 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e33d6752bd81d6cf3465a66ea579aae91f145618fe92e7bdcfb0598501e59224" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.376348 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-kljwc" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.497817 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2"] Oct 07 14:28:51 crc kubenswrapper[4717]: E1007 14:28:51.498361 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda137df-3fa6-470a-b41a-db9f55a550ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.498389 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda137df-3fa6-470a-b41a-db9f55a550ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.498615 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda137df-3fa6-470a-b41a-db9f55a550ab" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.499388 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.501959 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.502034 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.502222 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.502562 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.502685 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.505168 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2"] Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.630992 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnbmd\" (UniqueName: \"kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.631086 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.631144 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.631245 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.631517 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.733616 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.733687 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbmd\" (UniqueName: \"kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.733722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.733776 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.733820 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.734709 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.738385 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.738463 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.754342 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.757879 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnbmd\" (UniqueName: \"kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-xgpx2\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:51 crc kubenswrapper[4717]: I1007 14:28:51.825898 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:28:52 crc kubenswrapper[4717]: I1007 14:28:52.339922 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2"] Oct 07 14:28:52 crc kubenswrapper[4717]: W1007 14:28:52.341956 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29dd3c38_62bb_4f7c_9cef_7ab420156b0c.slice/crio-471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8 WatchSource:0}: Error finding container 471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8: Status 404 returned error can't find the container with id 471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8 Oct 07 14:28:52 crc kubenswrapper[4717]: I1007 14:28:52.388343 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" event={"ID":"29dd3c38-62bb-4f7c-9cef-7ab420156b0c","Type":"ContainerStarted","Data":"471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8"} Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.378673 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.381470 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.391450 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.414974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" event={"ID":"29dd3c38-62bb-4f7c-9cef-7ab420156b0c","Type":"ContainerStarted","Data":"84dc12004843bb65f4a88c5033fed820924abce5acf95851f0c1c775fc385014"} Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.434433 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" podStartSLOduration=2.054416749 podStartE2EDuration="2.434403819s" podCreationTimestamp="2025-10-07 14:28:51 +0000 UTC" firstStartedPulling="2025-10-07 14:28:52.343803979 +0000 UTC m=+2114.171729771" lastFinishedPulling="2025-10-07 14:28:52.723791049 +0000 UTC m=+2114.551716841" observedRunningTime="2025-10-07 14:28:53.431634593 +0000 UTC m=+2115.259560395" watchObservedRunningTime="2025-10-07 14:28:53.434403819 +0000 UTC m=+2115.262329611" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.475766 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.475847 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzlmh\" (UniqueName: \"kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.475955 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.577283 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.577350 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzlmh\" (UniqueName: \"kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.577402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content\") pod 
\"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.577854 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.577872 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.598537 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzlmh\" (UniqueName: \"kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh\") pod \"community-operators-ntggd\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:53 crc kubenswrapper[4717]: I1007 14:28:53.717133 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:28:54 crc kubenswrapper[4717]: I1007 14:28:54.250200 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:28:54 crc kubenswrapper[4717]: I1007 14:28:54.430260 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerStarted","Data":"27c414c835cd8c92daf1c3ff5ef949cb306bd8bc4b89f79db766faa7bcd77336"} Oct 07 14:28:55 crc kubenswrapper[4717]: I1007 14:28:55.437177 4717 generic.go:334] "Generic (PLEG): container finished" podID="e762617f-6329-4789-b11d-152bd16139ea" containerID="7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b" exitCode=0 Oct 07 14:28:55 crc kubenswrapper[4717]: I1007 14:28:55.437224 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerDied","Data":"7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b"} Oct 07 14:28:56 crc kubenswrapper[4717]: I1007 14:28:56.447146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerStarted","Data":"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1"} Oct 07 14:28:57 crc kubenswrapper[4717]: I1007 14:28:57.459884 4717 generic.go:334] "Generic (PLEG): container finished" podID="e762617f-6329-4789-b11d-152bd16139ea" containerID="ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1" exitCode=0 Oct 07 14:28:57 crc kubenswrapper[4717]: I1007 14:28:57.459925 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerDied","Data":"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1"} Oct 07 14:28:58 crc kubenswrapper[4717]: I1007 14:28:58.470101 4717 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerStarted","Data":"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14"} Oct 07 14:28:58 crc kubenswrapper[4717]: I1007 14:28:58.490627 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ntggd" podStartSLOduration=2.954266022 podStartE2EDuration="5.490607167s" podCreationTimestamp="2025-10-07 14:28:53 +0000 UTC" firstStartedPulling="2025-10-07 14:28:55.439176997 +0000 UTC m=+2117.267102789" lastFinishedPulling="2025-10-07 14:28:57.975518142 +0000 UTC m=+2119.803443934" observedRunningTime="2025-10-07 14:28:58.486751831 +0000 UTC m=+2120.314677643" watchObservedRunningTime="2025-10-07 14:28:58.490607167 +0000 UTC m=+2120.318532959" Oct 07 14:29:03 crc kubenswrapper[4717]: I1007 14:29:03.718072 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:03 crc kubenswrapper[4717]: I1007 14:29:03.718520 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:03 crc kubenswrapper[4717]: I1007 14:29:03.770474 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:04 crc kubenswrapper[4717]: I1007 14:29:04.559328 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:04 crc kubenswrapper[4717]: I1007 14:29:04.604178 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:29:06 crc kubenswrapper[4717]: I1007 14:29:06.528570 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ntggd" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="registry-server" containerID="cri-o://21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14" gracePeriod=2 Oct 07 14:29:06 crc kubenswrapper[4717]: I1007 14:29:06.981846 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.161697 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content\") pod \"e762617f-6329-4789-b11d-152bd16139ea\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.161784 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzlmh\" (UniqueName: \"kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh\") pod \"e762617f-6329-4789-b11d-152bd16139ea\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.161879 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities\") pod \"e762617f-6329-4789-b11d-152bd16139ea\" (UID: \"e762617f-6329-4789-b11d-152bd16139ea\") " Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.162641 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities" (OuterVolumeSpecName: "utilities") pod "e762617f-6329-4789-b11d-152bd16139ea" (UID: "e762617f-6329-4789-b11d-152bd16139ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.171266 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh" (OuterVolumeSpecName: "kube-api-access-jzlmh") pod "e762617f-6329-4789-b11d-152bd16139ea" (UID: "e762617f-6329-4789-b11d-152bd16139ea"). InnerVolumeSpecName "kube-api-access-jzlmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.209569 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e762617f-6329-4789-b11d-152bd16139ea" (UID: "e762617f-6329-4789-b11d-152bd16139ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.263838 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.263876 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzlmh\" (UniqueName: \"kubernetes.io/projected/e762617f-6329-4789-b11d-152bd16139ea-kube-api-access-jzlmh\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.263889 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e762617f-6329-4789-b11d-152bd16139ea-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.542792 4717 generic.go:334] "Generic (PLEG): container finished" podID="e762617f-6329-4789-b11d-152bd16139ea" containerID="21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14" exitCode=0 Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.542841 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerDied","Data":"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14"} Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.542872 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ntggd" event={"ID":"e762617f-6329-4789-b11d-152bd16139ea","Type":"ContainerDied","Data":"27c414c835cd8c92daf1c3ff5ef949cb306bd8bc4b89f79db766faa7bcd77336"} Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.542889 4717 scope.go:117] "RemoveContainer" containerID="21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.543065 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ntggd" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.571461 4717 scope.go:117] "RemoveContainer" containerID="ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.587035 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.596422 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ntggd"] Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.601090 4717 scope.go:117] "RemoveContainer" containerID="7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.636626 4717 scope.go:117] "RemoveContainer" containerID="21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14" Oct 07 14:29:07 crc kubenswrapper[4717]: E1007 14:29:07.637225 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14\": container with ID starting with 21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14 not found: ID does not exist" containerID="21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.637272 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14"} err="failed to get container status \"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14\": rpc error: code = NotFound desc = could not find container \"21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14\": container with ID starting with 21607903a280ba6095d9700e971660a65008f2a18050511144796a8b599e8c14 not found: ID does not exist" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.637298 4717 scope.go:117] "RemoveContainer" containerID="ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1" Oct 07 14:29:07 crc kubenswrapper[4717]: E1007 14:29:07.637719 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1\": container with ID starting with ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1 not found: ID does not exist" containerID="ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.637761 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1"} err="failed to get container status \"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1\": rpc error: code = NotFound desc = could not find container \"ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1\": container with ID starting with ddfecdd189e6b54e5e8e40a3e552f1b0745e33b4858a772151bae8914d7498f1 not found: ID does not exist" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.637790 4717 scope.go:117] "RemoveContainer" containerID="7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b" Oct 07 14:29:07 crc kubenswrapper[4717]: E1007 14:29:07.638123 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b\": container with ID starting with 7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b not found: ID does not exist" containerID="7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b" Oct 07 14:29:07 crc kubenswrapper[4717]: I1007 14:29:07.638175 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b"} err="failed to get container status \"7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b\": rpc error: code = NotFound desc = could not find container \"7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b\": container with ID starting with 7971360262c1f068c63b7d7de2635ab44db73917613bbaff0fddb270f000725b not found: ID does not exist" Oct 07 14:29:08 crc kubenswrapper[4717]: I1007 14:29:08.878822 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e762617f-6329-4789-b11d-152bd16139ea" path="/var/lib/kubelet/pods/e762617f-6329-4789-b11d-152bd16139ea/volumes" Oct 07 14:29:31 crc kubenswrapper[4717]: I1007 14:29:31.609809 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:29:31 crc kubenswrapper[4717]: I1007 14:29:31.610403 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.018147 4717 generic.go:334] "Generic (PLEG): container finished" podID="29dd3c38-62bb-4f7c-9cef-7ab420156b0c" containerID="84dc12004843bb65f4a88c5033fed820924abce5acf95851f0c1c775fc385014" exitCode=0 Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.018197 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" event={"ID":"29dd3c38-62bb-4f7c-9cef-7ab420156b0c","Type":"ContainerDied","Data":"84dc12004843bb65f4a88c5033fed820924abce5acf95851f0c1c775fc385014"} Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.142544 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh"] Oct 07 14:30:00 crc kubenswrapper[4717]: E1007 14:30:00.143041 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="registry-server" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.143063 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="registry-server" Oct 07 14:30:00 crc kubenswrapper[4717]: E1007 14:30:00.143081 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="extract-utilities" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.143090 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="extract-utilities" Oct 07 14:30:00 crc kubenswrapper[4717]: 
E1007 14:30:00.143108 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="extract-content" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.143116 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="extract-content" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.143415 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e762617f-6329-4789-b11d-152bd16139ea" containerName="registry-server" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.144195 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.146928 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.146942 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.154896 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh"] Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.234279 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.234708 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rc5r\" (UniqueName: \"kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.234804 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.337494 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.337643 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 
crc kubenswrapper[4717]: I1007 14:30:00.337727 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rc5r\" (UniqueName: \"kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.338779 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.348846 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.354592 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rc5r\" (UniqueName: \"kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r\") pod \"collect-profiles-29330790-m9pdh\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.466252 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:00 crc kubenswrapper[4717]: I1007 14:30:00.894341 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh"] Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.029820 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" event={"ID":"d1b5f91a-e982-4972-8886-78655079fb8d","Type":"ContainerStarted","Data":"439474d842c8892317b5efd24fc72073efb10db06ca03269f3f4b8f7713aa03b"} Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.348807 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.459937 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.460022 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.460269 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.460348 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnbmd\" (UniqueName: \"kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.460414 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.466791 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.469879 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd" (OuterVolumeSpecName: "kube-api-access-cnbmd") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c"). InnerVolumeSpecName "kube-api-access-cnbmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.495399 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:30:01 crc kubenswrapper[4717]: E1007 14:30:01.503353 4717 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory podName:29dd3c38-62bb-4f7c-9cef-7ab420156b0c nodeName:}" failed. 
No retries permitted until 2025-10-07 14:30:02.003327753 +0000 UTC m=+2183.831253535 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c") : error deleting /var/lib/kubelet/pods/29dd3c38-62bb-4f7c-9cef-7ab420156b0c/volume-subpaths: remove /var/lib/kubelet/pods/29dd3c38-62bb-4f7c-9cef-7ab420156b0c/volume-subpaths: no such file or directory Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.507107 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.563256 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnbmd\" (UniqueName: \"kubernetes.io/projected/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-kube-api-access-cnbmd\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.563297 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.563310 4717 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.563322 4717 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.609730 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:30:01 crc kubenswrapper[4717]: I1007 14:30:01.609991 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.046495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" event={"ID":"29dd3c38-62bb-4f7c-9cef-7ab420156b0c","Type":"ContainerDied","Data":"471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8"} Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.046565 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="471ecd900a9a7b794353b3cf536817bf1ecc3d1418f5d736c2b933efec2477d8" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.046503 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-xgpx2" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.052696 4717 generic.go:334] "Generic (PLEG): container finished" podID="d1b5f91a-e982-4972-8886-78655079fb8d" containerID="f2211b45882616aabaa640ab8c1ffefc19cb940f7a3bc695ccb94e11df9f5044" exitCode=0 Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.052810 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" event={"ID":"d1b5f91a-e982-4972-8886-78655079fb8d","Type":"ContainerDied","Data":"f2211b45882616aabaa640ab8c1ffefc19cb940f7a3bc695ccb94e11df9f5044"} Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.072690 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") pod \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\" (UID: \"29dd3c38-62bb-4f7c-9cef-7ab420156b0c\") " Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.082244 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory" (OuterVolumeSpecName: "inventory") pod "29dd3c38-62bb-4f7c-9cef-7ab420156b0c" (UID: "29dd3c38-62bb-4f7c-9cef-7ab420156b0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.142269 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w"] Oct 07 14:30:02 crc kubenswrapper[4717]: E1007 14:30:02.151468 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29dd3c38-62bb-4f7c-9cef-7ab420156b0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.151536 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="29dd3c38-62bb-4f7c-9cef-7ab420156b0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.154834 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="29dd3c38-62bb-4f7c-9cef-7ab420156b0c" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.157848 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.163429 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.164434 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.194298 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w"] Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.194394 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29dd3c38-62bb-4f7c-9cef-7ab420156b0c-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.295600 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.295954 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.295985 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.296247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.296377 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krfsl\" (UniqueName: \"kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.296482 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398524 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398589 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krfsl\" (UniqueName: \"kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398722 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398786 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.398821 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.402699 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc 
kubenswrapper[4717]: I1007 14:30:02.412856 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.412988 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.413298 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.413474 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.416427 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krfsl\" (UniqueName: \"kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:02 crc kubenswrapper[4717]: I1007 14:30:02.491241 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.009738 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w"] Oct 07 14:30:03 crc kubenswrapper[4717]: W1007 14:30:03.014938 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310bfdcc_9c71_4075_b8d3_af7c21dc3165.slice/crio-822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60 WatchSource:0}: Error finding container 822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60: Status 404 returned error can't find the container with id 822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60 Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.064515 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" event={"ID":"310bfdcc-9c71-4075-b8d3-af7c21dc3165","Type":"ContainerStarted","Data":"822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60"} Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.297891 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.428406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume\") pod \"d1b5f91a-e982-4972-8886-78655079fb8d\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.428543 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rc5r\" (UniqueName: \"kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r\") pod \"d1b5f91a-e982-4972-8886-78655079fb8d\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.428581 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume\") pod \"d1b5f91a-e982-4972-8886-78655079fb8d\" (UID: \"d1b5f91a-e982-4972-8886-78655079fb8d\") " Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.431400 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume" (OuterVolumeSpecName: "config-volume") pod "d1b5f91a-e982-4972-8886-78655079fb8d" (UID: "d1b5f91a-e982-4972-8886-78655079fb8d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.435466 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r" (OuterVolumeSpecName: "kube-api-access-6rc5r") pod "d1b5f91a-e982-4972-8886-78655079fb8d" (UID: "d1b5f91a-e982-4972-8886-78655079fb8d"). InnerVolumeSpecName "kube-api-access-6rc5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.435506 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d1b5f91a-e982-4972-8886-78655079fb8d" (UID: "d1b5f91a-e982-4972-8886-78655079fb8d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.531418 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d1b5f91a-e982-4972-8886-78655079fb8d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.531650 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rc5r\" (UniqueName: \"kubernetes.io/projected/d1b5f91a-e982-4972-8886-78655079fb8d-kube-api-access-6rc5r\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:03 crc kubenswrapper[4717]: I1007 14:30:03.531661 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d1b5f91a-e982-4972-8886-78655079fb8d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.075601 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" event={"ID":"d1b5f91a-e982-4972-8886-78655079fb8d","Type":"ContainerDied","Data":"439474d842c8892317b5efd24fc72073efb10db06ca03269f3f4b8f7713aa03b"} Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.075646 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="439474d842c8892317b5efd24fc72073efb10db06ca03269f3f4b8f7713aa03b" Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.075700 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh" Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.384440 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq"] Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.393691 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330745-kpzmq"] Oct 07 14:30:04 crc kubenswrapper[4717]: I1007 14:30:04.885080 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37c53362-c687-4727-97b9-95a9a358cf5b" path="/var/lib/kubelet/pods/37c53362-c687-4727-97b9-95a9a358cf5b/volumes" Oct 07 14:30:05 crc kubenswrapper[4717]: I1007 14:30:05.088712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" event={"ID":"310bfdcc-9c71-4075-b8d3-af7c21dc3165","Type":"ContainerStarted","Data":"416d70926c8d9235ab8fcbc2599562636ddb8c0fd0bf2f9b03570dca0c69653c"} Oct 07 14:30:05 crc kubenswrapper[4717]: I1007 14:30:05.113203 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" podStartSLOduration=2.148583083 podStartE2EDuration="3.113182561s" podCreationTimestamp="2025-10-07 14:30:02 +0000 UTC" firstStartedPulling="2025-10-07 14:30:03.018660721 +0000 UTC m=+2184.846586513" lastFinishedPulling="2025-10-07 14:30:03.983260209 +0000 UTC m=+2185.811185991" observedRunningTime="2025-10-07 14:30:05.10659734 +0000 UTC m=+2186.934523132" watchObservedRunningTime="2025-10-07 14:30:05.113182561 +0000 UTC m=+2186.941108363" Oct 07 14:30:31 crc kubenswrapper[4717]: I1007 14:30:31.609801 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:30:31 crc kubenswrapper[4717]: I1007 14:30:31.610410 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:30:31 crc kubenswrapper[4717]: I1007 14:30:31.610457 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:30:31 crc kubenswrapper[4717]: I1007 14:30:31.611168 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:30:31 crc kubenswrapper[4717]: I1007 14:30:31.611228 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" gracePeriod=600 Oct 07 14:30:32 crc 
kubenswrapper[4717]: I1007 14:30:32.329047 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" exitCode=0 Oct 07 14:30:32 crc kubenswrapper[4717]: I1007 14:30:32.329126 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d"} Oct 07 14:30:32 crc kubenswrapper[4717]: I1007 14:30:32.329462 4717 scope.go:117] "RemoveContainer" containerID="e96337bcb79192f8c77c0f06850a719e98effc004e5bae83edcb61766e65fad7" Oct 07 14:30:32 crc kubenswrapper[4717]: E1007 14:30:32.416806 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:30:33 crc kubenswrapper[4717]: I1007 14:30:33.342085 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:30:33 crc kubenswrapper[4717]: E1007 14:30:33.342467 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:30:44 crc kubenswrapper[4717]: I1007 14:30:44.060724 4717 scope.go:117] "RemoveContainer" containerID="8b01b86cce3843dc7fc1f14264207948424bd64f2051d7f85e0867b3cd4fed82" Oct 07 14:30:47 crc kubenswrapper[4717]: I1007 14:30:47.868708 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:30:47 crc kubenswrapper[4717]: E1007 14:30:47.869488 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:30:55 crc kubenswrapper[4717]: I1007 14:30:55.533393 4717 generic.go:334] "Generic (PLEG): container finished" podID="310bfdcc-9c71-4075-b8d3-af7c21dc3165" containerID="416d70926c8d9235ab8fcbc2599562636ddb8c0fd0bf2f9b03570dca0c69653c" exitCode=0 Oct 07 14:30:55 crc kubenswrapper[4717]: I1007 14:30:55.533492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" event={"ID":"310bfdcc-9c71-4075-b8d3-af7c21dc3165","Type":"ContainerDied","Data":"416d70926c8d9235ab8fcbc2599562636ddb8c0fd0bf2f9b03570dca0c69653c"} Oct 07 14:30:56 crc kubenswrapper[4717]: I1007 14:30:56.994382 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.123804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krfsl\" (UniqueName: \"kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.124503 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.124570 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.124625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.124656 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.124705 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory\") pod \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\" (UID: \"310bfdcc-9c71-4075-b8d3-af7c21dc3165\") " Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.130805 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.148295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl" (OuterVolumeSpecName: "kube-api-access-krfsl") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "kube-api-access-krfsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.152893 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.155088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.158470 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory" (OuterVolumeSpecName: "inventory") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.169392 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "310bfdcc-9c71-4075-b8d3-af7c21dc3165" (UID: "310bfdcc-9c71-4075-b8d3-af7c21dc3165"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227193 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krfsl\" (UniqueName: \"kubernetes.io/projected/310bfdcc-9c71-4075-b8d3-af7c21dc3165-kube-api-access-krfsl\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227233 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227245 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227259 4717 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227269 4717 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.227279 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/310bfdcc-9c71-4075-b8d3-af7c21dc3165-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.557486 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" event={"ID":"310bfdcc-9c71-4075-b8d3-af7c21dc3165","Type":"ContainerDied","Data":"822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60"} Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.557533 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.557538 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822a43aab23196c229739e11d1816c2703904f510dd0726cf65f0deab21a4b60" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.651197 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz"] Oct 07 14:30:57 crc kubenswrapper[4717]: E1007 14:30:57.651725 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310bfdcc-9c71-4075-b8d3-af7c21dc3165" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.651752 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="310bfdcc-9c71-4075-b8d3-af7c21dc3165" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:57 crc kubenswrapper[4717]: E1007 14:30:57.651787 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b5f91a-e982-4972-8886-78655079fb8d" containerName="collect-profiles" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.651796 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b5f91a-e982-4972-8886-78655079fb8d" containerName="collect-profiles" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.652087 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b5f91a-e982-4972-8886-78655079fb8d" containerName="collect-profiles" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.652112 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="310bfdcc-9c71-4075-b8d3-af7c21dc3165" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.652925 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.655272 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.655835 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.655913 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.656682 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.656692 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.659469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz"] Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.736284 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvhcm\" (UniqueName: \"kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.736339 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.736402 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.736419 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.736583 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.838259 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fvhcm\" (UniqueName: \"kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.838323 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.838388 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.838402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.838429 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.842896 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.846931 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.847703 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.848367 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.856956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvhcm\" (UniqueName: \"kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:57 crc kubenswrapper[4717]: I1007 14:30:57.977979 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:30:58 crc kubenswrapper[4717]: I1007 14:30:58.478641 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz"] Oct 07 14:30:58 crc kubenswrapper[4717]: I1007 14:30:58.566923 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" event={"ID":"73f6a900-f08d-4207-b89d-d8acfd404b8d","Type":"ContainerStarted","Data":"4ecab9de5a62342e763f2ab8ad17def7047f258c28be243671481cab905b433b"} Oct 07 14:30:59 crc kubenswrapper[4717]: I1007 14:30:59.882899 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:30:59 crc kubenswrapper[4717]: E1007 14:30:59.888265 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:31:00 crc kubenswrapper[4717]: I1007 14:31:00.587333 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" event={"ID":"73f6a900-f08d-4207-b89d-d8acfd404b8d","Type":"ContainerStarted","Data":"c45f68a4714035bc278769ed4253b892ff56baeecc60a5630d463cbe94711a0b"} Oct 07 14:31:00 crc kubenswrapper[4717]: I1007 14:31:00.615804 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" podStartSLOduration=2.37298611 podStartE2EDuration="3.615783732s" podCreationTimestamp="2025-10-07 14:30:57 +0000 UTC" firstStartedPulling="2025-10-07 14:30:58.48498774 +0000 UTC m=+2240.312913532" lastFinishedPulling="2025-10-07 14:30:59.727785362 +0000 UTC m=+2241.555711154" observedRunningTime="2025-10-07 14:31:00.609982133 +0000 UTC m=+2242.437907935" watchObservedRunningTime="2025-10-07 14:31:00.615783732 +0000 UTC m=+2242.443709524" Oct 07 14:31:12 crc kubenswrapper[4717]: I1007 14:31:12.868314 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:31:12 crc kubenswrapper[4717]: E1007 14:31:12.868996 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:31:25 crc kubenswrapper[4717]: I1007 14:31:25.869196 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:31:25 crc kubenswrapper[4717]: E1007 14:31:25.871834 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:31:36 crc kubenswrapper[4717]: I1007 14:31:36.869532 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:31:36 crc kubenswrapper[4717]: E1007 14:31:36.870418 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:31:49 crc kubenswrapper[4717]: I1007 14:31:49.868897 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:31:49 crc kubenswrapper[4717]: E1007 14:31:49.870080 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:32:04 crc kubenswrapper[4717]: I1007 14:32:04.868733 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:32:04 crc kubenswrapper[4717]: E1007 14:32:04.869711 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:32:15 crc kubenswrapper[4717]: I1007 14:32:15.869535 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:32:15 crc kubenswrapper[4717]: E1007 14:32:15.872630 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:32:27 crc kubenswrapper[4717]: I1007 14:32:27.869884 4717 
scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:32:27 crc kubenswrapper[4717]: E1007 14:32:27.871247 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:32:42 crc kubenswrapper[4717]: I1007 14:32:42.868983 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:32:42 crc kubenswrapper[4717]: E1007 14:32:42.869681 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:32:54 crc kubenswrapper[4717]: I1007 14:32:54.869388 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:32:54 crc kubenswrapper[4717]: E1007 14:32:54.870483 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:33:06 crc kubenswrapper[4717]: I1007 14:33:06.868494 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:33:06 crc kubenswrapper[4717]: E1007 14:33:06.869233 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:33:20 crc kubenswrapper[4717]: I1007 14:33:20.868591 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:33:20 crc kubenswrapper[4717]: E1007 14:33:20.869509 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:33:33 crc kubenswrapper[4717]: I1007 14:33:33.868412 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:33:33 crc kubenswrapper[4717]: E1007 14:33:33.869332 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:33:45 crc kubenswrapper[4717]: I1007 14:33:45.868621 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:33:45 crc kubenswrapper[4717]: E1007 14:33:45.869623 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:33:58 crc kubenswrapper[4717]: I1007 14:33:58.874529 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:33:58 crc kubenswrapper[4717]: E1007 14:33:58.875309 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:34:10 crc kubenswrapper[4717]: I1007 14:34:10.876784 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:34:10 crc kubenswrapper[4717]: E1007 14:34:10.877915 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:34:25 crc kubenswrapper[4717]: I1007 14:34:25.868340 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:34:25 crc kubenswrapper[4717]: E1007 14:34:25.869097 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:34:39 crc kubenswrapper[4717]: I1007 14:34:39.868719 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:34:39 crc kubenswrapper[4717]: E1007 14:34:39.869345 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:34:53 crc kubenswrapper[4717]: I1007 14:34:53.868991 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:34:53 crc kubenswrapper[4717]: E1007 14:34:53.869734 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:35:08 crc kubenswrapper[4717]: I1007 14:35:08.874374 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:35:08 crc kubenswrapper[4717]: E1007 14:35:08.875288 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:35:14 crc kubenswrapper[4717]: I1007 14:35:14.834082 4717 generic.go:334] "Generic (PLEG): container finished" podID="73f6a900-f08d-4207-b89d-d8acfd404b8d" containerID="c45f68a4714035bc278769ed4253b892ff56baeecc60a5630d463cbe94711a0b" exitCode=0 Oct 07 14:35:14 crc kubenswrapper[4717]: I1007 14:35:14.834175 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" event={"ID":"73f6a900-f08d-4207-b89d-d8acfd404b8d","Type":"ContainerDied","Data":"c45f68a4714035bc278769ed4253b892ff56baeecc60a5630d463cbe94711a0b"} Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.279739 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.353522 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvhcm\" (UniqueName: \"kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm\") pod \"73f6a900-f08d-4207-b89d-d8acfd404b8d\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.353600 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle\") pod \"73f6a900-f08d-4207-b89d-d8acfd404b8d\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.353625 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key\") pod \"73f6a900-f08d-4207-b89d-d8acfd404b8d\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.353743 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory\") pod \"73f6a900-f08d-4207-b89d-d8acfd404b8d\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.353815 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0\") pod \"73f6a900-f08d-4207-b89d-d8acfd404b8d\" (UID: \"73f6a900-f08d-4207-b89d-d8acfd404b8d\") " Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.369479 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm" (OuterVolumeSpecName: "kube-api-access-fvhcm") pod "73f6a900-f08d-4207-b89d-d8acfd404b8d" (UID: "73f6a900-f08d-4207-b89d-d8acfd404b8d"). InnerVolumeSpecName "kube-api-access-fvhcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.372160 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "73f6a900-f08d-4207-b89d-d8acfd404b8d" (UID: "73f6a900-f08d-4207-b89d-d8acfd404b8d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.387146 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory" (OuterVolumeSpecName: "inventory") pod "73f6a900-f08d-4207-b89d-d8acfd404b8d" (UID: "73f6a900-f08d-4207-b89d-d8acfd404b8d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.392615 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "73f6a900-f08d-4207-b89d-d8acfd404b8d" (UID: "73f6a900-f08d-4207-b89d-d8acfd404b8d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.394534 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "73f6a900-f08d-4207-b89d-d8acfd404b8d" (UID: "73f6a900-f08d-4207-b89d-d8acfd404b8d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.455689 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.455834 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.455919 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvhcm\" (UniqueName: \"kubernetes.io/projected/73f6a900-f08d-4207-b89d-d8acfd404b8d-kube-api-access-fvhcm\") on node \"crc\" DevicePath \"\"" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.455980 4717 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.456063 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/73f6a900-f08d-4207-b89d-d8acfd404b8d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.851930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" event={"ID":"73f6a900-f08d-4207-b89d-d8acfd404b8d","Type":"ContainerDied","Data":"4ecab9de5a62342e763f2ab8ad17def7047f258c28be243671481cab905b433b"} Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.852211 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ecab9de5a62342e763f2ab8ad17def7047f258c28be243671481cab905b433b" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.852045 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.948781 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx"] Oct 07 14:35:16 crc kubenswrapper[4717]: E1007 14:35:16.949609 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73f6a900-f08d-4207-b89d-d8acfd404b8d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.949636 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="73f6a900-f08d-4207-b89d-d8acfd404b8d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.950119 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="73f6a900-f08d-4207-b89d-d8acfd404b8d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.950860 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954437 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954546 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954665 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954734 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954813 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.954899 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.956219 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:35:16 crc kubenswrapper[4717]: I1007 14:35:16.967155 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx"] Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067072 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067134 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067171 4717 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067259 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067296 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggnpd\" (UniqueName: \"kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067314 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067523 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067568 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.067745 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.170128 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 
07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.170184 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.170239 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.170285 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.170321 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.171071 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.171173 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.171212 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggnpd\" (UniqueName: \"kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.171230 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.176058 4717 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.180831 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.181645 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.181666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.187980 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.197681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.205834 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggnpd\" (UniqueName: \"kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.206432 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.218716 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xxwpx\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.275753 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.789216 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx"] Oct 07 14:35:17 crc kubenswrapper[4717]: W1007 14:35:17.796858 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44a1daee_eac0_4c51_ae29_1afa919bcb68.slice/crio-3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235 WatchSource:0}: Error finding container 3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235: Status 404 returned error can't find the container with id 3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235 Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.800692 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:35:17 crc kubenswrapper[4717]: I1007 14:35:17.860301 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" event={"ID":"44a1daee-eac0-4c51-ae29-1afa919bcb68","Type":"ContainerStarted","Data":"3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235"} Oct 07 14:35:19 crc kubenswrapper[4717]: I1007 14:35:19.869045 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:35:19 crc kubenswrapper[4717]: E1007 14:35:19.869730 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:35:19 crc kubenswrapper[4717]: I1007 14:35:19.878549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" event={"ID":"44a1daee-eac0-4c51-ae29-1afa919bcb68","Type":"ContainerStarted","Data":"f5bf05bf51c21cce84063e9ddae805b32897289b3ec685b38c90a79ef921e1ea"} Oct 07 14:35:31 crc kubenswrapper[4717]: I1007 14:35:31.868560 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:35:32 crc kubenswrapper[4717]: I1007 14:35:32.992466 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924"} Oct 07 14:35:33 crc kubenswrapper[4717]: I1007 14:35:33.019353 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" podStartSLOduration=15.979944072 podStartE2EDuration="17.019334683s" podCreationTimestamp="2025-10-07 14:35:16 +0000 UTC" firstStartedPulling="2025-10-07 
14:35:17.800254879 +0000 UTC m=+2499.628180671" lastFinishedPulling="2025-10-07 14:35:18.83964549 +0000 UTC m=+2500.667571282" observedRunningTime="2025-10-07 14:35:19.89566044 +0000 UTC m=+2501.723586232" watchObservedRunningTime="2025-10-07 14:35:33.019334683 +0000 UTC m=+2514.847260465" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.305711 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.308793 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.320267 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.412261 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56tc2\" (UniqueName: \"kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.412327 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.412475 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.514354 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.514509 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56tc2\" (UniqueName: \"kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.514550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.514884 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities\") pod \"redhat-marketplace-9dcl2\" (UID: 
\"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.515306 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.539118 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56tc2\" (UniqueName: \"kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2\") pod \"redhat-marketplace-9dcl2\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:10 crc kubenswrapper[4717]: I1007 14:37:10.631439 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:11 crc kubenswrapper[4717]: I1007 14:37:11.070608 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:11 crc kubenswrapper[4717]: I1007 14:37:11.898709 4717 generic.go:334] "Generic (PLEG): container finished" podID="155b69de-d591-4be8-a359-d0a647eb01f0" containerID="10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b" exitCode=0 Oct 07 14:37:11 crc kubenswrapper[4717]: I1007 14:37:11.898751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerDied","Data":"10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b"} Oct 07 14:37:11 crc kubenswrapper[4717]: I1007 14:37:11.898774 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerStarted","Data":"b931855c66a640254c70a2e56dc5d125bd7b6271a2e3f96a47c7fad7671d4bf7"} Oct 07 14:37:13 crc kubenswrapper[4717]: I1007 14:37:13.918824 4717 generic.go:334] "Generic (PLEG): container finished" podID="155b69de-d591-4be8-a359-d0a647eb01f0" containerID="43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c" exitCode=0 Oct 07 14:37:13 crc kubenswrapper[4717]: I1007 14:37:13.918919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerDied","Data":"43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c"} Oct 07 14:37:14 crc kubenswrapper[4717]: I1007 14:37:14.930967 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerStarted","Data":"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9"} Oct 07 14:37:14 crc kubenswrapper[4717]: I1007 14:37:14.954576 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9dcl2" podStartSLOduration=2.470483515 podStartE2EDuration="4.954558597s" podCreationTimestamp="2025-10-07 14:37:10 +0000 UTC" firstStartedPulling="2025-10-07 14:37:11.900873511 +0000 UTC m=+2613.728799303" lastFinishedPulling="2025-10-07 14:37:14.384948593 +0000 UTC m=+2616.212874385" observedRunningTime="2025-10-07 
14:37:14.945389898 +0000 UTC m=+2616.773315690" watchObservedRunningTime="2025-10-07 14:37:14.954558597 +0000 UTC m=+2616.782484389" Oct 07 14:37:20 crc kubenswrapper[4717]: I1007 14:37:20.631669 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:20 crc kubenswrapper[4717]: I1007 14:37:20.632241 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:20 crc kubenswrapper[4717]: I1007 14:37:20.673775 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:21 crc kubenswrapper[4717]: I1007 14:37:21.033594 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:21 crc kubenswrapper[4717]: I1007 14:37:21.073791 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.005914 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9dcl2" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="registry-server" containerID="cri-o://b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9" gracePeriod=2 Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.418440 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.581945 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56tc2\" (UniqueName: \"kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2\") pod \"155b69de-d591-4be8-a359-d0a647eb01f0\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.582069 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content\") pod \"155b69de-d591-4be8-a359-d0a647eb01f0\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.582171 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities\") pod \"155b69de-d591-4be8-a359-d0a647eb01f0\" (UID: \"155b69de-d591-4be8-a359-d0a647eb01f0\") " Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.582943 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities" (OuterVolumeSpecName: "utilities") pod "155b69de-d591-4be8-a359-d0a647eb01f0" (UID: "155b69de-d591-4be8-a359-d0a647eb01f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.587110 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2" (OuterVolumeSpecName: "kube-api-access-56tc2") pod "155b69de-d591-4be8-a359-d0a647eb01f0" (UID: "155b69de-d591-4be8-a359-d0a647eb01f0"). InnerVolumeSpecName "kube-api-access-56tc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.595065 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "155b69de-d591-4be8-a359-d0a647eb01f0" (UID: "155b69de-d591-4be8-a359-d0a647eb01f0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.684148 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56tc2\" (UniqueName: \"kubernetes.io/projected/155b69de-d591-4be8-a359-d0a647eb01f0-kube-api-access-56tc2\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.684187 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:23 crc kubenswrapper[4717]: I1007 14:37:23.684200 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/155b69de-d591-4be8-a359-d0a647eb01f0-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.020114 4717 generic.go:334] "Generic (PLEG): container finished" podID="155b69de-d591-4be8-a359-d0a647eb01f0" containerID="b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9" exitCode=0 Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.020172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerDied","Data":"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9"} Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.020212 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dcl2" event={"ID":"155b69de-d591-4be8-a359-d0a647eb01f0","Type":"ContainerDied","Data":"b931855c66a640254c70a2e56dc5d125bd7b6271a2e3f96a47c7fad7671d4bf7"} Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.020261 4717 scope.go:117] "RemoveContainer" containerID="b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.020343 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dcl2" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.066742 4717 scope.go:117] "RemoveContainer" containerID="43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.075581 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.085640 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dcl2"] Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.106077 4717 scope.go:117] "RemoveContainer" containerID="10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.148949 4717 scope.go:117] "RemoveContainer" containerID="b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9" Oct 07 14:37:24 crc kubenswrapper[4717]: E1007 14:37:24.149506 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9\": container with ID starting with b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9 not found: ID does not exist" containerID="b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.149550 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9"} err="failed to get container status \"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9\": rpc error: code = NotFound desc = could not find container \"b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9\": container with ID starting with b66ece18f8bd291afff39e3c10a4c13e163770453e285da32718245858efc0e9 not found: ID does not exist" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.149572 4717 scope.go:117] "RemoveContainer" containerID="43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c" Oct 07 14:37:24 crc kubenswrapper[4717]: E1007 14:37:24.150307 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c\": container with ID starting with 43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c not found: ID does not exist" containerID="43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.150396 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c"} err="failed to get container status \"43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c\": rpc error: code = NotFound desc = could not find container \"43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c\": container with ID starting with 43bd477d30ae4bf0efb4b2c5ea5ed58fd352a47f4b06535f47ca4ee6149b363c not found: ID does not exist" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.150444 4717 scope.go:117] "RemoveContainer" containerID="10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b" Oct 07 14:37:24 crc kubenswrapper[4717]: E1007 14:37:24.151102 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b\": container with ID starting with 10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b not found: ID does not exist" containerID="10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.151162 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b"} err="failed to get container status \"10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b\": rpc error: code = NotFound desc = could not find container \"10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b\": container with ID starting with 10b43aea79d9f91c5675c28bb38d370f2d2c0008eccb2d35a87bbf7e14d1105b not found: ID does not exist" Oct 07 14:37:24 crc kubenswrapper[4717]: I1007 14:37:24.900707 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" path="/var/lib/kubelet/pods/155b69de-d591-4be8-a359-d0a647eb01f0/volumes" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.631993 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:37 crc kubenswrapper[4717]: E1007 14:37:37.632851 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="extract-utilities" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.632864 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="extract-utilities" Oct 07 14:37:37 crc kubenswrapper[4717]: E1007 14:37:37.632895 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="extract-content" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.632902 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="extract-content" Oct 07 14:37:37 crc kubenswrapper[4717]: E1007 14:37:37.632930 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="registry-server" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.632936 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="registry-server" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.633146 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="155b69de-d591-4be8-a359-d0a647eb01f0" containerName="registry-server" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.634512 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.642601 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.686898 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.686965 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdpkd\" (UniqueName: \"kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.687265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.789825 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.789893 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdpkd\" (UniqueName: \"kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.789980 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.790412 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.790465 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.812166 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zdpkd\" (UniqueName: \"kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd\") pod \"redhat-operators-dbhbb\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:37 crc kubenswrapper[4717]: I1007 14:37:37.955732 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:38 crc kubenswrapper[4717]: I1007 14:37:38.344957 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:39 crc kubenswrapper[4717]: I1007 14:37:39.171159 4717 generic.go:334] "Generic (PLEG): container finished" podID="154135f7-3c97-49e5-a899-6e141275db60" containerID="203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57" exitCode=0 Oct 07 14:37:39 crc kubenswrapper[4717]: I1007 14:37:39.171276 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerDied","Data":"203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57"} Oct 07 14:37:39 crc kubenswrapper[4717]: I1007 14:37:39.171760 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerStarted","Data":"df3cd755d48d7037cfca6fa2ce50b00e8bc6050b6e602eecd12a1fc66df51535"} Oct 07 14:37:41 crc kubenswrapper[4717]: I1007 14:37:41.189612 4717 generic.go:334] "Generic (PLEG): container finished" podID="154135f7-3c97-49e5-a899-6e141275db60" containerID="1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf" exitCode=0 Oct 07 14:37:41 crc kubenswrapper[4717]: I1007 14:37:41.189664 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerDied","Data":"1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf"} Oct 07 14:37:42 crc kubenswrapper[4717]: I1007 14:37:42.203650 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerStarted","Data":"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8"} Oct 07 14:37:42 crc kubenswrapper[4717]: I1007 14:37:42.228807 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dbhbb" podStartSLOduration=2.560992877 podStartE2EDuration="5.228786853s" podCreationTimestamp="2025-10-07 14:37:37 +0000 UTC" firstStartedPulling="2025-10-07 14:37:39.173828894 +0000 UTC m=+2641.001754686" lastFinishedPulling="2025-10-07 14:37:41.84162287 +0000 UTC m=+2643.669548662" observedRunningTime="2025-10-07 14:37:42.21944259 +0000 UTC m=+2644.047368392" watchObservedRunningTime="2025-10-07 14:37:42.228786853 +0000 UTC m=+2644.056712645" Oct 07 14:37:47 crc kubenswrapper[4717]: I1007 14:37:47.956299 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:47 crc kubenswrapper[4717]: I1007 14:37:47.956830 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:48 crc kubenswrapper[4717]: I1007 14:37:48.007525 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:48 crc kubenswrapper[4717]: I1007 14:37:48.296405 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:48 crc kubenswrapper[4717]: I1007 14:37:48.339475 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.267087 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dbhbb" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="registry-server" containerID="cri-o://233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8" gracePeriod=2 Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.705576 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.787836 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdpkd\" (UniqueName: \"kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd\") pod \"154135f7-3c97-49e5-a899-6e141275db60\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.787990 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content\") pod \"154135f7-3c97-49e5-a899-6e141275db60\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.788242 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities\") pod \"154135f7-3c97-49e5-a899-6e141275db60\" (UID: \"154135f7-3c97-49e5-a899-6e141275db60\") " Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.788816 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities" (OuterVolumeSpecName: "utilities") pod "154135f7-3c97-49e5-a899-6e141275db60" (UID: "154135f7-3c97-49e5-a899-6e141275db60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.792671 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd" (OuterVolumeSpecName: "kube-api-access-zdpkd") pod "154135f7-3c97-49e5-a899-6e141275db60" (UID: "154135f7-3c97-49e5-a899-6e141275db60"). InnerVolumeSpecName "kube-api-access-zdpkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.871616 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "154135f7-3c97-49e5-a899-6e141275db60" (UID: "154135f7-3c97-49e5-a899-6e141275db60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.890429 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.890463 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdpkd\" (UniqueName: \"kubernetes.io/projected/154135f7-3c97-49e5-a899-6e141275db60-kube-api-access-zdpkd\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:50 crc kubenswrapper[4717]: I1007 14:37:50.890481 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/154135f7-3c97-49e5-a899-6e141275db60-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.280051 4717 generic.go:334] "Generic (PLEG): container finished" podID="154135f7-3c97-49e5-a899-6e141275db60" containerID="233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8" exitCode=0 Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.280096 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerDied","Data":"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8"} Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.280124 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dbhbb" event={"ID":"154135f7-3c97-49e5-a899-6e141275db60","Type":"ContainerDied","Data":"df3cd755d48d7037cfca6fa2ce50b00e8bc6050b6e602eecd12a1fc66df51535"} Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.280122 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dbhbb" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.280142 4717 scope.go:117] "RemoveContainer" containerID="233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.306913 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.308261 4717 scope.go:117] "RemoveContainer" containerID="1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.315468 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dbhbb"] Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.327567 4717 scope.go:117] "RemoveContainer" containerID="203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.370107 4717 scope.go:117] "RemoveContainer" containerID="233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8" Oct 07 14:37:51 crc kubenswrapper[4717]: E1007 14:37:51.370660 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8\": container with ID starting with 233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8 not found: ID does not exist" containerID="233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.370701 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8"} err="failed to get container status \"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8\": rpc error: code = NotFound desc = could not find container \"233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8\": container with ID starting with 233bff907a07c17a1beb4a7697f991e84219d044b8a6a84cdc7e2f1c9ed3d2b8 not found: ID does not exist" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.370730 4717 scope.go:117] "RemoveContainer" containerID="1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf" Oct 07 14:37:51 crc kubenswrapper[4717]: E1007 14:37:51.371382 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf\": container with ID starting with 1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf not found: ID does not exist" containerID="1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.371406 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf"} err="failed to get container status \"1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf\": rpc error: code = NotFound desc = could not find container \"1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf\": container with ID starting with 1d776fe397a8d027ac1c81594388f7b44cbd0ce854b63fb32eb24ff68687c2bf not found: ID does not exist" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.371422 4717 scope.go:117] "RemoveContainer" 
containerID="203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57" Oct 07 14:37:51 crc kubenswrapper[4717]: E1007 14:37:51.371687 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57\": container with ID starting with 203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57 not found: ID does not exist" containerID="203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57" Oct 07 14:37:51 crc kubenswrapper[4717]: I1007 14:37:51.371721 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57"} err="failed to get container status \"203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57\": rpc error: code = NotFound desc = could not find container \"203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57\": container with ID starting with 203c8d0084962bfe33f1ceb5c2d778aa2dafa73ffc11651b6ba3b16b23121e57 not found: ID does not exist" Oct 07 14:37:52 crc kubenswrapper[4717]: I1007 14:37:52.879114 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154135f7-3c97-49e5-a899-6e141275db60" path="/var/lib/kubelet/pods/154135f7-3c97-49e5-a899-6e141275db60/volumes" Oct 07 14:38:01 crc kubenswrapper[4717]: I1007 14:38:01.610453 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:38:01 crc kubenswrapper[4717]: I1007 14:38:01.610986 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.086809 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:24 crc kubenswrapper[4717]: E1007 14:38:24.087828 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="extract-content" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.087846 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="extract-content" Oct 07 14:38:24 crc kubenswrapper[4717]: E1007 14:38:24.087902 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="extract-utilities" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.087910 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="extract-utilities" Oct 07 14:38:24 crc kubenswrapper[4717]: E1007 14:38:24.087921 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="registry-server" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.087932 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="registry-server" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 
14:38:24.088175 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="154135f7-3c97-49e5-a899-6e141275db60" containerName="registry-server" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.089858 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.103400 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.275613 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fccjc\" (UniqueName: \"kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.276357 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.276396 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.377750 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.377807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.377865 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fccjc\" (UniqueName: \"kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.378398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.378398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.404809 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fccjc\" (UniqueName: \"kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc\") pod \"certified-operators-j55kk\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.409645 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:24 crc kubenswrapper[4717]: I1007 14:38:24.933599 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:25 crc kubenswrapper[4717]: I1007 14:38:25.609734 4717 generic.go:334] "Generic (PLEG): container finished" podID="30459034-5617-4f01-ade1-fe6c81216f48" containerID="567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac" exitCode=0 Oct 07 14:38:25 crc kubenswrapper[4717]: I1007 14:38:25.609856 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerDied","Data":"567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac"} Oct 07 14:38:25 crc kubenswrapper[4717]: I1007 14:38:25.610433 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerStarted","Data":"4b68ce05ae9c3d780eb3dfe3dd2f0f6edb8e159f4abd5c9f488f736c203421f5"} Oct 07 14:38:27 crc kubenswrapper[4717]: I1007 14:38:27.629397 4717 generic.go:334] "Generic (PLEG): container finished" podID="30459034-5617-4f01-ade1-fe6c81216f48" containerID="80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6" exitCode=0 Oct 07 14:38:27 crc kubenswrapper[4717]: I1007 14:38:27.629492 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerDied","Data":"80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6"} Oct 07 14:38:29 crc kubenswrapper[4717]: I1007 14:38:29.649829 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerStarted","Data":"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4"} Oct 07 14:38:31 crc kubenswrapper[4717]: I1007 14:38:31.610313 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:38:31 crc kubenswrapper[4717]: I1007 14:38:31.610628 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:38:34 crc 
kubenswrapper[4717]: I1007 14:38:34.410312 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:34 crc kubenswrapper[4717]: I1007 14:38:34.410389 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:34 crc kubenswrapper[4717]: I1007 14:38:34.464037 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:34 crc kubenswrapper[4717]: I1007 14:38:34.490365 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j55kk" podStartSLOduration=7.652700489 podStartE2EDuration="10.490348076s" podCreationTimestamp="2025-10-07 14:38:24 +0000 UTC" firstStartedPulling="2025-10-07 14:38:25.611712577 +0000 UTC m=+2687.439638369" lastFinishedPulling="2025-10-07 14:38:28.449360164 +0000 UTC m=+2690.277285956" observedRunningTime="2025-10-07 14:38:29.669085661 +0000 UTC m=+2691.497011453" watchObservedRunningTime="2025-10-07 14:38:34.490348076 +0000 UTC m=+2696.318273868" Oct 07 14:38:34 crc kubenswrapper[4717]: I1007 14:38:34.748161 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:34 crc kubenswrapper[4717]: I1007 14:38:34.810519 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:36 crc kubenswrapper[4717]: I1007 14:38:36.713043 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j55kk" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="registry-server" containerID="cri-o://8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4" gracePeriod=2 Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.191057 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.359035 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities\") pod \"30459034-5617-4f01-ade1-fe6c81216f48\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.359312 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content\") pod \"30459034-5617-4f01-ade1-fe6c81216f48\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.359369 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fccjc\" (UniqueName: \"kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc\") pod \"30459034-5617-4f01-ade1-fe6c81216f48\" (UID: \"30459034-5617-4f01-ade1-fe6c81216f48\") " Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.360161 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities" (OuterVolumeSpecName: "utilities") pod "30459034-5617-4f01-ade1-fe6c81216f48" (UID: "30459034-5617-4f01-ade1-fe6c81216f48"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.366733 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc" (OuterVolumeSpecName: "kube-api-access-fccjc") pod "30459034-5617-4f01-ade1-fe6c81216f48" (UID: "30459034-5617-4f01-ade1-fe6c81216f48"). InnerVolumeSpecName "kube-api-access-fccjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.406474 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30459034-5617-4f01-ade1-fe6c81216f48" (UID: "30459034-5617-4f01-ade1-fe6c81216f48"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.462442 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.462488 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30459034-5617-4f01-ade1-fe6c81216f48-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.462505 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fccjc\" (UniqueName: \"kubernetes.io/projected/30459034-5617-4f01-ade1-fe6c81216f48-kube-api-access-fccjc\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.724588 4717 generic.go:334] "Generic (PLEG): container finished" podID="30459034-5617-4f01-ade1-fe6c81216f48" containerID="8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4" exitCode=0 Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.724625 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerDied","Data":"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4"} Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.724659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j55kk" event={"ID":"30459034-5617-4f01-ade1-fe6c81216f48","Type":"ContainerDied","Data":"4b68ce05ae9c3d780eb3dfe3dd2f0f6edb8e159f4abd5c9f488f736c203421f5"} Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.724673 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j55kk" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.724699 4717 scope.go:117] "RemoveContainer" containerID="8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.749118 4717 scope.go:117] "RemoveContainer" containerID="80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.754280 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.765275 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j55kk"] Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.784960 4717 scope.go:117] "RemoveContainer" containerID="567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.813696 4717 scope.go:117] "RemoveContainer" containerID="8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4" Oct 07 14:38:37 crc kubenswrapper[4717]: E1007 14:38:37.814219 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4\": container with ID starting with 8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4 not found: ID does not exist" containerID="8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.814266 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4"} err="failed to get container status \"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4\": rpc error: code = NotFound desc = could not find container \"8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4\": container with ID starting with 8d70bb1926f0c591778cfc0d979235bbaa366db04361178bc3d4526109ff17c4 not found: ID does not exist" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.814297 4717 scope.go:117] "RemoveContainer" containerID="80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6" Oct 07 14:38:37 crc kubenswrapper[4717]: E1007 14:38:37.814691 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6\": container with ID starting with 80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6 not found: ID does not exist" containerID="80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.814726 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6"} err="failed to get container status \"80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6\": rpc error: code = NotFound desc = could not find container \"80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6\": container with ID starting with 80543c3fb6bfa3f9f14127533146b62336076ac0acbaeb3ff1bd9658bf93e0f6 not found: ID does not exist" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.814750 4717 scope.go:117] "RemoveContainer" 
containerID="567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac" Oct 07 14:38:37 crc kubenswrapper[4717]: E1007 14:38:37.815106 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac\": container with ID starting with 567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac not found: ID does not exist" containerID="567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac" Oct 07 14:38:37 crc kubenswrapper[4717]: I1007 14:38:37.815132 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac"} err="failed to get container status \"567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac\": rpc error: code = NotFound desc = could not find container \"567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac\": container with ID starting with 567c9e63e5ab6ece200d688408606820e1774f677ca7d0a1e5cc59dc1329a8ac not found: ID does not exist" Oct 07 14:38:38 crc kubenswrapper[4717]: I1007 14:38:38.878347 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30459034-5617-4f01-ade1-fe6c81216f48" path="/var/lib/kubelet/pods/30459034-5617-4f01-ade1-fe6c81216f48/volumes" Oct 07 14:38:44 crc kubenswrapper[4717]: I1007 14:38:44.793000 4717 generic.go:334] "Generic (PLEG): container finished" podID="44a1daee-eac0-4c51-ae29-1afa919bcb68" containerID="f5bf05bf51c21cce84063e9ddae805b32897289b3ec685b38c90a79ef921e1ea" exitCode=0 Oct 07 14:38:44 crc kubenswrapper[4717]: I1007 14:38:44.793093 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" event={"ID":"44a1daee-eac0-4c51-ae29-1afa919bcb68","Type":"ContainerDied","Data":"f5bf05bf51c21cce84063e9ddae805b32897289b3ec685b38c90a79ef921e1ea"} Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.177975 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347560 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347626 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347709 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347754 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347954 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggnpd\" (UniqueName: \"kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.347981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.348049 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.348092 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.348117 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1\") pod \"44a1daee-eac0-4c51-ae29-1afa919bcb68\" (UID: \"44a1daee-eac0-4c51-ae29-1afa919bcb68\") " Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.363029 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.363088 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd" (OuterVolumeSpecName: "kube-api-access-ggnpd") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "kube-api-access-ggnpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.381968 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.386958 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.387042 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.387497 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.388766 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.389961 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory" (OuterVolumeSpecName: "inventory") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.393975 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44a1daee-eac0-4c51-ae29-1afa919bcb68" (UID: "44a1daee-eac0-4c51-ae29-1afa919bcb68"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450198 4717 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450233 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450244 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450253 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450263 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggnpd\" (UniqueName: \"kubernetes.io/projected/44a1daee-eac0-4c51-ae29-1afa919bcb68-kube-api-access-ggnpd\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450273 4717 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450282 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450291 4717 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.450299 4717 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/44a1daee-eac0-4c51-ae29-1afa919bcb68-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.810481 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" event={"ID":"44a1daee-eac0-4c51-ae29-1afa919bcb68","Type":"ContainerDied","Data":"3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235"} Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.810532 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3231075ad738686a4a270fc50139cd6f54874f71cfae03d60e040b8f6ab6e235" Oct 07 14:38:46 crc 
kubenswrapper[4717]: I1007 14:38:46.810948 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xxwpx" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.886842 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj"] Oct 07 14:38:46 crc kubenswrapper[4717]: E1007 14:38:46.887596 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="extract-utilities" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887619 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="extract-utilities" Oct 07 14:38:46 crc kubenswrapper[4717]: E1007 14:38:46.887634 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44a1daee-eac0-4c51-ae29-1afa919bcb68" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887642 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="44a1daee-eac0-4c51-ae29-1afa919bcb68" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 14:38:46 crc kubenswrapper[4717]: E1007 14:38:46.887669 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="extract-content" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887676 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="extract-content" Oct 07 14:38:46 crc kubenswrapper[4717]: E1007 14:38:46.887692 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="registry-server" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887701 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="registry-server" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887948 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="44a1daee-eac0-4c51-ae29-1afa919bcb68" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.887970 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="30459034-5617-4f01-ade1-fe6c81216f48" containerName="registry-server" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.888708 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.892276 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.893943 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.893979 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d54z2" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.893994 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.894245 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 07 14:38:46 crc kubenswrapper[4717]: I1007 14:38:46.910177 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj"] Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.063488 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.063558 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.063865 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cf58\" (UniqueName: \"kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.063953 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.064033 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 
14:38:47.064200 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.064306 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166271 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166384 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166536 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166595 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cf58\" (UniqueName: \"kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166666 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.166710 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.170445 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.170849 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.171734 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.178805 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.179837 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.182408 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cf58\" (UniqueName: \"kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.183342 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj\" (UID: 
\"87e2107a-8eec-497a-b811-7d339dbfe176\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.211111 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.754952 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj"] Oct 07 14:38:47 crc kubenswrapper[4717]: I1007 14:38:47.819673 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" event={"ID":"87e2107a-8eec-497a-b811-7d339dbfe176","Type":"ContainerStarted","Data":"bbe20012e6b9ba3298ba3bf4ca6e876f34372ea5ad7ae64e2c99058f6e0624d9"} Oct 07 14:38:48 crc kubenswrapper[4717]: I1007 14:38:48.828691 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" event={"ID":"87e2107a-8eec-497a-b811-7d339dbfe176","Type":"ContainerStarted","Data":"b341497853280a24ce13415acc2bcbc2721675d0086b61165c4dd30110db3bbd"} Oct 07 14:38:49 crc kubenswrapper[4717]: I1007 14:38:49.857978 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" podStartSLOduration=3.113440358 podStartE2EDuration="3.857958956s" podCreationTimestamp="2025-10-07 14:38:46 +0000 UTC" firstStartedPulling="2025-10-07 14:38:47.767416811 +0000 UTC m=+2709.595342603" lastFinishedPulling="2025-10-07 14:38:48.511935409 +0000 UTC m=+2710.339861201" observedRunningTime="2025-10-07 14:38:49.849332703 +0000 UTC m=+2711.677258495" watchObservedRunningTime="2025-10-07 14:38:49.857958956 +0000 UTC m=+2711.685884748" Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.610063 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.610586 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.610624 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.611121 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.611180 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" 
containerID="cri-o://5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924" gracePeriod=600 Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.939078 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924" exitCode=0 Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.939146 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924"} Oct 07 14:39:01 crc kubenswrapper[4717]: I1007 14:39:01.939430 4717 scope.go:117] "RemoveContainer" containerID="ad0cc45cafb8ba760971c7d3b56fb9b3d757affceaf4ba3abb0d8989d5e4bc7d" Oct 07 14:39:02 crc kubenswrapper[4717]: I1007 14:39:02.955894 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e"} Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.440873 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.447236 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.453775 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.533795 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbbn\" (UniqueName: \"kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.533959 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.533987 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.635316 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbbn\" (UniqueName: \"kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.635477 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.635514 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.635960 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.636088 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.655372 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbbn\" (UniqueName: \"kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn\") pod \"community-operators-zwmrh\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:44 crc kubenswrapper[4717]: I1007 14:39:44.781178 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:45 crc kubenswrapper[4717]: I1007 14:39:45.817053 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:46 crc kubenswrapper[4717]: I1007 14:39:46.338406 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerID="a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9" exitCode=0 Oct 07 14:39:46 crc kubenswrapper[4717]: I1007 14:39:46.338608 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerDied","Data":"a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9"} Oct 07 14:39:46 crc kubenswrapper[4717]: I1007 14:39:46.338782 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerStarted","Data":"9616fb3631eabedcb0252140cb8e2dd45de15588a4617d3c2ba5d1daadfc2b47"} Oct 07 14:39:47 crc kubenswrapper[4717]: I1007 14:39:47.351165 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerStarted","Data":"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77"} Oct 07 14:39:48 crc kubenswrapper[4717]: I1007 14:39:48.360521 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerID="c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77" exitCode=0 Oct 07 14:39:48 crc kubenswrapper[4717]: I1007 14:39:48.360621 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerDied","Data":"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77"} Oct 07 14:39:49 crc kubenswrapper[4717]: I1007 14:39:49.371681 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerStarted","Data":"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df"} Oct 07 14:39:49 crc kubenswrapper[4717]: I1007 14:39:49.391711 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zwmrh" podStartSLOduration=2.862125056 podStartE2EDuration="5.391695508s" podCreationTimestamp="2025-10-07 14:39:44 +0000 UTC" firstStartedPulling="2025-10-07 14:39:46.340995901 +0000 UTC m=+2768.168921693" lastFinishedPulling="2025-10-07 14:39:48.870566353 +0000 UTC m=+2770.698492145" observedRunningTime="2025-10-07 14:39:49.386104486 +0000 UTC m=+2771.214030278" watchObservedRunningTime="2025-10-07 14:39:49.391695508 +0000 UTC m=+2771.219621300" Oct 07 14:39:54 crc kubenswrapper[4717]: I1007 14:39:54.781364 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:54 crc kubenswrapper[4717]: I1007 14:39:54.781956 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:54 crc kubenswrapper[4717]: I1007 14:39:54.852158 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:55 crc kubenswrapper[4717]: I1007 14:39:55.470524 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:55 crc kubenswrapper[4717]: I1007 14:39:55.508490 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:57 crc kubenswrapper[4717]: I1007 14:39:57.443849 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zwmrh" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="registry-server" containerID="cri-o://8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df" gracePeriod=2 Oct 07 14:39:57 crc kubenswrapper[4717]: I1007 14:39:57.899083 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.013610 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content\") pod \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.013737 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcbbn\" (UniqueName: \"kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn\") pod \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.013995 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities\") pod \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\" (UID: \"6f0f2d7d-35a2-4072-b235-6aab1084ba03\") " Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.015124 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities" (OuterVolumeSpecName: "utilities") pod "6f0f2d7d-35a2-4072-b235-6aab1084ba03" (UID: "6f0f2d7d-35a2-4072-b235-6aab1084ba03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.026214 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn" (OuterVolumeSpecName: "kube-api-access-lcbbn") pod "6f0f2d7d-35a2-4072-b235-6aab1084ba03" (UID: "6f0f2d7d-35a2-4072-b235-6aab1084ba03"). InnerVolumeSpecName "kube-api-access-lcbbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.116429 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcbbn\" (UniqueName: \"kubernetes.io/projected/6f0f2d7d-35a2-4072-b235-6aab1084ba03-kube-api-access-lcbbn\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.116479 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.216028 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f0f2d7d-35a2-4072-b235-6aab1084ba03" (UID: "6f0f2d7d-35a2-4072-b235-6aab1084ba03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.219071 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f0f2d7d-35a2-4072-b235-6aab1084ba03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.454521 4717 generic.go:334] "Generic (PLEG): container finished" podID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerID="8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df" exitCode=0 Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.454582 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zwmrh" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.454576 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerDied","Data":"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df"} Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.454641 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zwmrh" event={"ID":"6f0f2d7d-35a2-4072-b235-6aab1084ba03","Type":"ContainerDied","Data":"9616fb3631eabedcb0252140cb8e2dd45de15588a4617d3c2ba5d1daadfc2b47"} Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.454667 4717 scope.go:117] "RemoveContainer" containerID="8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.481830 4717 scope.go:117] "RemoveContainer" containerID="c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.484174 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.491822 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zwmrh"] Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.512793 4717 scope.go:117] "RemoveContainer" containerID="a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.547771 4717 scope.go:117] "RemoveContainer" containerID="8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df" Oct 07 14:39:58 crc kubenswrapper[4717]: E1007 14:39:58.548432 4717 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df\": container with ID starting with 8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df not found: ID does not exist" containerID="8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.548469 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df"} err="failed to get container status \"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df\": rpc error: code = NotFound desc = could not find container \"8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df\": container with ID starting with 8eabfad6f7d44d9057e9cde2dcaf19a649a50bddeb1837f93f9165d89c0f73df not found: ID does not exist" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.548494 4717 scope.go:117] "RemoveContainer" containerID="c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77" Oct 07 14:39:58 crc kubenswrapper[4717]: E1007 14:39:58.548851 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77\": container with ID starting with c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77 not found: ID does not exist" containerID="c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.548877 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77"} err="failed to get container status \"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77\": rpc error: code = NotFound desc = could not find container \"c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77\": container with ID starting with c58f31d99eb0cb438f7ccf9368aa74b9f9778a413106967e871bae96d0164c77 not found: ID does not exist" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.548894 4717 scope.go:117] "RemoveContainer" containerID="a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9" Oct 07 14:39:58 crc kubenswrapper[4717]: E1007 14:39:58.549435 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9\": container with ID starting with a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9 not found: ID does not exist" containerID="a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.549465 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9"} err="failed to get container status \"a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9\": rpc error: code = NotFound desc = could not find container \"a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9\": container with ID starting with a42af1db66d709c7d7fb0c21b713f7214950b5ab713f447744d962088c4ca1b9 not found: ID does not exist" Oct 07 14:39:58 crc kubenswrapper[4717]: I1007 14:39:58.878558 4717 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" path="/var/lib/kubelet/pods/6f0f2d7d-35a2-4072-b235-6aab1084ba03/volumes" Oct 07 14:41:01 crc kubenswrapper[4717]: I1007 14:41:01.609255 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:41:01 crc kubenswrapper[4717]: I1007 14:41:01.609791 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:41:21 crc kubenswrapper[4717]: I1007 14:41:21.225557 4717 generic.go:334] "Generic (PLEG): container finished" podID="87e2107a-8eec-497a-b811-7d339dbfe176" containerID="b341497853280a24ce13415acc2bcbc2721675d0086b61165c4dd30110db3bbd" exitCode=0 Oct 07 14:41:21 crc kubenswrapper[4717]: I1007 14:41:21.225643 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" event={"ID":"87e2107a-8eec-497a-b811-7d339dbfe176","Type":"ContainerDied","Data":"b341497853280a24ce13415acc2bcbc2721675d0086b61165c4dd30110db3bbd"} Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.685374 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.798981 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799143 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799192 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799312 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: 
\"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799364 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.799406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cf58\" (UniqueName: \"kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58\") pod \"87e2107a-8eec-497a-b811-7d339dbfe176\" (UID: \"87e2107a-8eec-497a-b811-7d339dbfe176\") " Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.807733 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58" (OuterVolumeSpecName: "kube-api-access-7cf58") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "kube-api-access-7cf58". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.821308 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.831680 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.833803 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.833973 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory" (OuterVolumeSpecName: "inventory") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.836653 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.850153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "87e2107a-8eec-497a-b811-7d339dbfe176" (UID: "87e2107a-8eec-497a-b811-7d339dbfe176"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901732 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cf58\" (UniqueName: \"kubernetes.io/projected/87e2107a-8eec-497a-b811-7d339dbfe176-kube-api-access-7cf58\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901763 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901773 4717 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901782 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901791 4717 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901799 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:22 crc kubenswrapper[4717]: I1007 14:41:22.901808 4717 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87e2107a-8eec-497a-b811-7d339dbfe176-inventory\") on node \"crc\" DevicePath \"\"" Oct 07 14:41:23 crc kubenswrapper[4717]: I1007 14:41:23.243131 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" event={"ID":"87e2107a-8eec-497a-b811-7d339dbfe176","Type":"ContainerDied","Data":"bbe20012e6b9ba3298ba3bf4ca6e876f34372ea5ad7ae64e2c99058f6e0624d9"} Oct 07 14:41:23 crc kubenswrapper[4717]: I1007 14:41:23.243370 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbe20012e6b9ba3298ba3bf4ca6e876f34372ea5ad7ae64e2c99058f6e0624d9" Oct 07 14:41:23 crc kubenswrapper[4717]: I1007 14:41:23.243218 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj" Oct 07 14:41:31 crc kubenswrapper[4717]: I1007 14:41:31.609504 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:41:31 crc kubenswrapper[4717]: I1007 14:41:31.610184 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:42:01 crc kubenswrapper[4717]: I1007 14:42:01.610301 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:42:01 crc kubenswrapper[4717]: I1007 14:42:01.610821 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:42:01 crc kubenswrapper[4717]: I1007 14:42:01.610872 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:42:01 crc kubenswrapper[4717]: I1007 14:42:01.611843 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:42:01 crc kubenswrapper[4717]: I1007 14:42:01.611899 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" gracePeriod=600 Oct 07 14:42:01 crc kubenswrapper[4717]: E1007 14:42:01.734485 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:42:02 crc kubenswrapper[4717]: I1007 14:42:02.596952 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" exitCode=0 Oct 07 14:42:02 crc kubenswrapper[4717]: I1007 14:42:02.597036 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e"} Oct 07 14:42:02 crc kubenswrapper[4717]: I1007 14:42:02.597097 4717 scope.go:117] "RemoveContainer" containerID="5b5cd7afab4830bbc2ddc68beae74171a9b0689b09e6debe85258eea971f5924" Oct 07 14:42:02 crc kubenswrapper[4717]: I1007 14:42:02.597777 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:42:02 crc kubenswrapper[4717]: E1007 14:42:02.598162 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:42:17 crc kubenswrapper[4717]: I1007 14:42:17.868571 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:42:17 crc kubenswrapper[4717]: E1007 14:42:17.869417 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.139970 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 14:42:29 crc kubenswrapper[4717]: E1007 14:42:29.140996 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="extract-content" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141032 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="extract-content" Oct 07 14:42:29 crc kubenswrapper[4717]: E1007 14:42:29.141051 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="registry-server" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141059 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="registry-server" Oct 07 14:42:29 crc kubenswrapper[4717]: E1007 14:42:29.141095 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e2107a-8eec-497a-b811-7d339dbfe176" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141102 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e2107a-8eec-497a-b811-7d339dbfe176" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 14:42:29 crc kubenswrapper[4717]: E1007 14:42:29.141116 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="extract-utilities" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141123 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="extract-utilities" Oct 07 
14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141352 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f0f2d7d-35a2-4072-b235-6aab1084ba03" containerName="registry-server" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.141381 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e2107a-8eec-497a-b811-7d339dbfe176" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.142089 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.143649 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.143679 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.144885 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.152936 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.274418 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.274927 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275199 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275408 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275564 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9bnw\" (UniqueName: \"kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275677 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275809 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.275938 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.276067 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377550 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9bnw\" (UniqueName: \"kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377600 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377634 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377664 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377688 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377807 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377832 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.377855 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.378578 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.378938 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.379275 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.379796 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.380409 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.384521 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" 
Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.385123 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.391209 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.395398 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9bnw\" (UniqueName: \"kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.416058 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.472186 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.899878 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 07 14:42:29 crc kubenswrapper[4717]: W1007 14:42:29.901777 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ce6c960_67bf_4ed7_b3bc_b8bdbf53a3d7.slice/crio-5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06 WatchSource:0}: Error finding container 5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06: Status 404 returned error can't find the container with id 5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06 Oct 07 14:42:29 crc kubenswrapper[4717]: I1007 14:42:29.904279 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:42:30 crc kubenswrapper[4717]: I1007 14:42:30.834798 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7","Type":"ContainerStarted","Data":"5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06"} Oct 07 14:42:31 crc kubenswrapper[4717]: I1007 14:42:31.868317 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:42:31 crc kubenswrapper[4717]: E1007 14:42:31.869176 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:42:46 crc kubenswrapper[4717]: I1007 14:42:46.868899 4717 scope.go:117] "RemoveContainer" 
containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:42:46 crc kubenswrapper[4717]: E1007 14:42:46.869714 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:42:56 crc kubenswrapper[4717]: E1007 14:42:56.446197 4717 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 07 14:42:56 crc kubenswrapper[4717]: E1007 14:42:56.447710 4717 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t9bnw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReferenc
e:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 07 14:42:56 crc kubenswrapper[4717]: E1007 14:42:56.449102 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" Oct 07 14:42:57 crc kubenswrapper[4717]: E1007 14:42:57.083334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" Oct 07 14:43:00 crc kubenswrapper[4717]: I1007 14:43:00.869223 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:43:00 crc kubenswrapper[4717]: E1007 14:43:00.869837 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:43:09 crc kubenswrapper[4717]: I1007 14:43:09.416852 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 07 14:43:11 crc kubenswrapper[4717]: I1007 14:43:11.192432 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7","Type":"ContainerStarted","Data":"826cda5c8ba26449da2b79bb50d05cc5d4c96cf4c3371a5e201a2643a7e0d86a"} Oct 07 14:43:11 crc kubenswrapper[4717]: I1007 14:43:11.219866 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.709712453 podStartE2EDuration="43.219844521s" podCreationTimestamp="2025-10-07 14:42:28 +0000 UTC" firstStartedPulling="2025-10-07 14:42:29.904105257 +0000 UTC m=+2931.732031049" lastFinishedPulling="2025-10-07 14:43:09.414237325 +0000 UTC m=+2971.242163117" observedRunningTime="2025-10-07 14:43:11.209007887 +0000 UTC m=+2973.036933679" watchObservedRunningTime="2025-10-07 14:43:11.219844521 +0000 UTC m=+2973.047770313" Oct 07 14:43:11 crc kubenswrapper[4717]: I1007 14:43:11.868697 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:43:11 crc kubenswrapper[4717]: E1007 14:43:11.869225 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
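\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6"

The pod_startup_latency_tracker entry just above reports two figures for openstack/tempest-tests-tempest: podStartE2EDuration="43.219844521s" and podStartSLOduration=3.709712453. The timestamps printed in that same entry are enough to reproduce both: the end-to-end figure is watchObservedRunningTime minus podCreationTimestamp, and for this entry the SLO figure equals that span minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so the roughly 39.5 s spent pulling quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified (including the attempt canceled at 14:42:56) is discounted. A small sketch recomputing both values from the logged timestamps; the numbers are quoted from the entry, while reading the SLO figure as "end-to-end minus pull time" is inferred from this data rather than from kubelet source.

```python
from datetime import datetime, timezone

def ts(s: str) -> datetime:
    # The journal prints nanoseconds; trim to microseconds for strptime's %f.
    return datetime.strptime(s[:26], "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

created       = datetime(2025, 10, 7, 14, 42, 28, tzinfo=timezone.utc)  # podCreationTimestamp
first_pulling = ts("2025-10-07 14:42:29.904105257")                     # firstStartedPulling
last_pulling  = ts("2025-10-07 14:43:09.414237325")                     # lastFinishedPulling
observed      = ts("2025-10-07 14:43:11.219844521")                     # watchObservedRunningTime

e2e = (observed - created).total_seconds()
slo = e2e - (last_pulling - first_pulling).total_seconds()
print(f"podStartE2EDuration ~ {e2e:.6f}s")  # 43.219844s vs logged 43.219844521s
print(f"podStartSLOduration ~ {slo:.6f}s")  # 3.709712s  vs logged 3.709712453
```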
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:43:22 crc kubenswrapper[4717]: I1007 14:43:22.869719 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:43:22 crc kubenswrapper[4717]: E1007 14:43:22.870390 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:43:34 crc kubenswrapper[4717]: I1007 14:43:34.869344 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:43:34 crc kubenswrapper[4717]: E1007 14:43:34.870242 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:43:49 crc kubenswrapper[4717]: I1007 14:43:49.869186 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:43:49 crc kubenswrapper[4717]: E1007 14:43:49.870846 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:44:02 crc kubenswrapper[4717]: I1007 14:44:02.868303 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:44:02 crc kubenswrapper[4717]: E1007 14:44:02.869396 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:44:17 crc kubenswrapper[4717]: I1007 14:44:17.868241 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:44:17 crc kubenswrapper[4717]: E1007 14:44:17.869068 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:44:29 crc kubenswrapper[4717]: I1007 14:44:29.869064 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:44:29 crc kubenswrapper[4717]: E1007 14:44:29.869919 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:44:41 crc kubenswrapper[4717]: I1007 14:44:41.869069 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:44:41 crc kubenswrapper[4717]: E1007 14:44:41.870747 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:44:54 crc kubenswrapper[4717]: I1007 14:44:54.869247 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:44:54 crc kubenswrapper[4717]: E1007 14:44:54.870246 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.193563 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf"] Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.196562 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.207449 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.207918 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.208285 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf"] Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.323387 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v52lm\" (UniqueName: \"kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.323501 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.323637 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.424919 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v52lm\" (UniqueName: \"kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.425112 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.425277 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.426318 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume\") pod 
\"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.448038 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.450966 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v52lm\" (UniqueName: \"kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm\") pod \"collect-profiles-29330805-kvzcf\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:00 crc kubenswrapper[4717]: I1007 14:45:00.523999 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:01 crc kubenswrapper[4717]: I1007 14:45:01.035389 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf"] Oct 07 14:45:01 crc kubenswrapper[4717]: I1007 14:45:01.142930 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" event={"ID":"ba418081-7600-4056-9dbf-b322deceee98","Type":"ContainerStarted","Data":"9289e88832d4d05f0860a59531e82b2f4edc73366acd6c181085026970cc9ce2"} Oct 07 14:45:02 crc kubenswrapper[4717]: I1007 14:45:02.153039 4717 generic.go:334] "Generic (PLEG): container finished" podID="ba418081-7600-4056-9dbf-b322deceee98" containerID="99ec3f4f3eb99d1e23e778823ad1333cd7a0e053be9351bd985b1f71979245f7" exitCode=0 Oct 07 14:45:02 crc kubenswrapper[4717]: I1007 14:45:02.153085 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" event={"ID":"ba418081-7600-4056-9dbf-b322deceee98","Type":"ContainerDied","Data":"99ec3f4f3eb99d1e23e778823ad1333cd7a0e053be9351bd985b1f71979245f7"} Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.748199 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.908841 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume\") pod \"ba418081-7600-4056-9dbf-b322deceee98\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.909029 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume\") pod \"ba418081-7600-4056-9dbf-b322deceee98\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.909261 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v52lm\" (UniqueName: \"kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm\") pod \"ba418081-7600-4056-9dbf-b322deceee98\" (UID: \"ba418081-7600-4056-9dbf-b322deceee98\") " Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.909835 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba418081-7600-4056-9dbf-b322deceee98" (UID: "ba418081-7600-4056-9dbf-b322deceee98"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.915718 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba418081-7600-4056-9dbf-b322deceee98" (UID: "ba418081-7600-4056-9dbf-b322deceee98"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 14:45:03 crc kubenswrapper[4717]: I1007 14:45:03.916250 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm" (OuterVolumeSpecName: "kube-api-access-v52lm") pod "ba418081-7600-4056-9dbf-b322deceee98" (UID: "ba418081-7600-4056-9dbf-b322deceee98"). InnerVolumeSpecName "kube-api-access-v52lm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.012964 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v52lm\" (UniqueName: \"kubernetes.io/projected/ba418081-7600-4056-9dbf-b322deceee98-kube-api-access-v52lm\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.013075 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba418081-7600-4056-9dbf-b322deceee98-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.013089 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba418081-7600-4056-9dbf-b322deceee98-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.174303 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" event={"ID":"ba418081-7600-4056-9dbf-b322deceee98","Type":"ContainerDied","Data":"9289e88832d4d05f0860a59531e82b2f4edc73366acd6c181085026970cc9ce2"} Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.174362 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9289e88832d4d05f0860a59531e82b2f4edc73366acd6c181085026970cc9ce2" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.174386 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf" Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.831650 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk"] Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.841310 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330760-525fk"] Oct 07 14:45:04 crc kubenswrapper[4717]: I1007 14:45:04.885245 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3120106e-abb9-496f-ba83-81806347c89c" path="/var/lib/kubelet/pods/3120106e-abb9-496f-ba83-81806347c89c/volumes" Oct 07 14:45:05 crc kubenswrapper[4717]: I1007 14:45:05.868105 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:45:05 crc kubenswrapper[4717]: E1007 14:45:05.868712 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:45:16 crc kubenswrapper[4717]: I1007 14:45:16.868771 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:45:16 crc kubenswrapper[4717]: E1007 14:45:16.869520 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:45:27 crc kubenswrapper[4717]: I1007 14:45:27.868402 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:45:27 crc kubenswrapper[4717]: E1007 14:45:27.869161 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:45:40 crc kubenswrapper[4717]: I1007 14:45:40.868625 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:45:40 crc kubenswrapper[4717]: E1007 14:45:40.869399 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:45:45 crc kubenswrapper[4717]: I1007 14:45:45.137163 4717 scope.go:117] "RemoveContainer" containerID="4029f85a0293606220d499720dfc611e8bf3fd6581f93e370c295104a36a49b5" Oct 07 14:45:54 crc kubenswrapper[4717]: I1007 14:45:54.869154 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:45:54 crc kubenswrapper[4717]: E1007 14:45:54.869979 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:46:09 crc kubenswrapper[4717]: I1007 14:46:09.869275 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:46:09 crc kubenswrapper[4717]: E1007 14:46:09.869958 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:46:23 crc kubenswrapper[4717]: I1007 14:46:23.869714 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:46:23 crc kubenswrapper[4717]: E1007 14:46:23.873687 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:46:35 crc kubenswrapper[4717]: I1007 14:46:35.869209 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:46:35 crc kubenswrapper[4717]: E1007 14:46:35.869895 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:46:48 crc kubenswrapper[4717]: I1007 14:46:48.876254 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:46:48 crc kubenswrapper[4717]: E1007 14:46:48.877149 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:47:03 crc kubenswrapper[4717]: I1007 14:47:03.869440 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:47:04 crc kubenswrapper[4717]: I1007 14:47:04.295437 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31"} Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.178731 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:05 crc kubenswrapper[4717]: E1007 14:48:05.179672 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba418081-7600-4056-9dbf-b322deceee98" containerName="collect-profiles" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.179688 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba418081-7600-4056-9dbf-b322deceee98" containerName="collect-profiles" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.179890 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba418081-7600-4056-9dbf-b322deceee98" containerName="collect-profiles" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.181479 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.195328 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.337247 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.337457 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9wn9\" (UniqueName: \"kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.337545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.439499 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.440042 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9wn9\" (UniqueName: \"kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.440132 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.440208 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.440604 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.466734 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k9wn9\" (UniqueName: \"kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9\") pod \"redhat-marketplace-tzpvh\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:05 crc kubenswrapper[4717]: I1007 14:48:05.510486 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.032425 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.177581 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.182972 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.192703 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.358486 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.358563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jlv\" (UniqueName: \"kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.358759 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.461176 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.461312 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.461362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jlv\" (UniqueName: \"kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " 
pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.461666 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.461824 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.492749 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jlv\" (UniqueName: \"kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv\") pod \"redhat-operators-kmm6w\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.525693 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.905585 4717 generic.go:334] "Generic (PLEG): container finished" podID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerID="d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae" exitCode=0 Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.905901 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerDied","Data":"d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae"} Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.905939 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerStarted","Data":"ff65cea4839ab90489e20152b881e25d420930db94da74035f1f49b9d0daace2"} Oct 07 14:48:06 crc kubenswrapper[4717]: I1007 14:48:06.909986 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:48:07 crc kubenswrapper[4717]: W1007 14:48:07.032768 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6aee2a2_becc_44fb_8950_a9464e8b518d.slice/crio-97a8ef2155a61001d24afc4730a0e472eadd385826569d7cbfd5ea2f9a92560c WatchSource:0}: Error finding container 97a8ef2155a61001d24afc4730a0e472eadd385826569d7cbfd5ea2f9a92560c: Status 404 returned error can't find the container with id 97a8ef2155a61001d24afc4730a0e472eadd385826569d7cbfd5ea2f9a92560c Oct 07 14:48:07 crc kubenswrapper[4717]: I1007 14:48:07.033511 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:07 crc kubenswrapper[4717]: I1007 14:48:07.918338 4717 generic.go:334] "Generic (PLEG): container finished" podID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerID="abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453" exitCode=0 Oct 07 14:48:07 crc kubenswrapper[4717]: I1007 14:48:07.918585 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerDied","Data":"abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453"} Oct 07 14:48:07 crc kubenswrapper[4717]: I1007 14:48:07.918651 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerStarted","Data":"97a8ef2155a61001d24afc4730a0e472eadd385826569d7cbfd5ea2f9a92560c"} Oct 07 14:48:09 crc kubenswrapper[4717]: I1007 14:48:09.951484 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerStarted","Data":"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0"} Oct 07 14:48:09 crc kubenswrapper[4717]: I1007 14:48:09.962717 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerStarted","Data":"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd"} Oct 07 14:48:10 crc kubenswrapper[4717]: I1007 14:48:10.990330 4717 generic.go:334] "Generic (PLEG): container finished" podID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerID="b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0" exitCode=0 Oct 07 14:48:10 crc kubenswrapper[4717]: I1007 14:48:10.991694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerDied","Data":"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0"} Oct 07 14:48:13 crc kubenswrapper[4717]: I1007 14:48:13.012478 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerStarted","Data":"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c"} Oct 07 14:48:13 crc kubenswrapper[4717]: I1007 14:48:13.039851 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzpvh" podStartSLOduration=2.284665714 podStartE2EDuration="8.039823884s" podCreationTimestamp="2025-10-07 14:48:05 +0000 UTC" firstStartedPulling="2025-10-07 14:48:06.909546014 +0000 UTC m=+3268.737471806" lastFinishedPulling="2025-10-07 14:48:12.664704184 +0000 UTC m=+3274.492629976" observedRunningTime="2025-10-07 14:48:13.031718273 +0000 UTC m=+3274.859644065" watchObservedRunningTime="2025-10-07 14:48:13.039823884 +0000 UTC m=+3274.867749676" Oct 07 14:48:15 crc kubenswrapper[4717]: I1007 14:48:15.510966 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:15 crc kubenswrapper[4717]: I1007 14:48:15.511557 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:16 crc kubenswrapper[4717]: I1007 14:48:16.044078 4717 generic.go:334] "Generic (PLEG): container finished" podID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerID="da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd" exitCode=0 Oct 07 14:48:16 crc kubenswrapper[4717]: I1007 14:48:16.044151 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" 
event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerDied","Data":"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd"} Oct 07 14:48:16 crc kubenswrapper[4717]: I1007 14:48:16.571243 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-tzpvh" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="registry-server" probeResult="failure" output=< Oct 07 14:48:16 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:48:16 crc kubenswrapper[4717]: > Oct 07 14:48:19 crc kubenswrapper[4717]: I1007 14:48:19.072144 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerStarted","Data":"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09"} Oct 07 14:48:19 crc kubenswrapper[4717]: I1007 14:48:19.103405 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kmm6w" podStartSLOduration=2.569549101 podStartE2EDuration="13.103385176s" podCreationTimestamp="2025-10-07 14:48:06 +0000 UTC" firstStartedPulling="2025-10-07 14:48:07.920682643 +0000 UTC m=+3269.748608435" lastFinishedPulling="2025-10-07 14:48:18.454518718 +0000 UTC m=+3280.282444510" observedRunningTime="2025-10-07 14:48:19.093082656 +0000 UTC m=+3280.921008458" watchObservedRunningTime="2025-10-07 14:48:19.103385176 +0000 UTC m=+3280.931310968" Oct 07 14:48:25 crc kubenswrapper[4717]: I1007 14:48:25.560917 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:25 crc kubenswrapper[4717]: I1007 14:48:25.611426 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:25 crc kubenswrapper[4717]: I1007 14:48:25.794974 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:26 crc kubenswrapper[4717]: I1007 14:48:26.526458 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:26 crc kubenswrapper[4717]: I1007 14:48:26.526764 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:26 crc kubenswrapper[4717]: I1007 14:48:26.577808 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.169259 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tzpvh" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="registry-server" containerID="cri-o://1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c" gracePeriod=2 Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.217932 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.957652 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.998397 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content\") pod \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.998573 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities\") pod \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.998648 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9wn9\" (UniqueName: \"kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9\") pod \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\" (UID: \"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267\") " Oct 07 14:48:27 crc kubenswrapper[4717]: I1007 14:48:27.999172 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities" (OuterVolumeSpecName: "utilities") pod "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" (UID: "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.001120 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.004686 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9" (OuterVolumeSpecName: "kube-api-access-k9wn9") pod "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" (UID: "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267"). InnerVolumeSpecName "kube-api-access-k9wn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.011153 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" (UID: "93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.102678 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.102742 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9wn9\" (UniqueName: \"kubernetes.io/projected/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267-kube-api-access-k9wn9\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.181490 4717 generic.go:334] "Generic (PLEG): container finished" podID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerID="1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c" exitCode=0 Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.182437 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzpvh" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.183140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerDied","Data":"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c"} Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.183215 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzpvh" event={"ID":"93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267","Type":"ContainerDied","Data":"ff65cea4839ab90489e20152b881e25d420930db94da74035f1f49b9d0daace2"} Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.183257 4717 scope.go:117] "RemoveContainer" containerID="1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.206715 4717 scope.go:117] "RemoveContainer" containerID="b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.221652 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.231829 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzpvh"] Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.234741 4717 scope.go:117] "RemoveContainer" containerID="d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.290793 4717 scope.go:117] "RemoveContainer" containerID="1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c" Oct 07 14:48:28 crc kubenswrapper[4717]: E1007 14:48:28.291243 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c\": container with ID starting with 1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c not found: ID does not exist" containerID="1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.291288 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c"} err="failed to get container status 
\"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c\": rpc error: code = NotFound desc = could not find container \"1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c\": container with ID starting with 1517633ec50f317eca7f30d254dc6aac9e72ddee194a75ec3e109f585a48844c not found: ID does not exist" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.291316 4717 scope.go:117] "RemoveContainer" containerID="b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0" Oct 07 14:48:28 crc kubenswrapper[4717]: E1007 14:48:28.291623 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0\": container with ID starting with b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0 not found: ID does not exist" containerID="b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.291648 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0"} err="failed to get container status \"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0\": rpc error: code = NotFound desc = could not find container \"b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0\": container with ID starting with b278594f38518cc5c282d2dd426123512d3336b30e9449baf86f80c059368fd0 not found: ID does not exist" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.291663 4717 scope.go:117] "RemoveContainer" containerID="d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae" Oct 07 14:48:28 crc kubenswrapper[4717]: E1007 14:48:28.291852 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae\": container with ID starting with d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae not found: ID does not exist" containerID="d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.291871 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae"} err="failed to get container status \"d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae\": rpc error: code = NotFound desc = could not find container \"d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae\": container with ID starting with d8d42d9e72779a4e66e5d27c7c1f1b019daa2a98576f1fe0f76a4b10c28066ae not found: ID does not exist" Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.397446 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:28 crc kubenswrapper[4717]: I1007 14:48:28.879591 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" path="/var/lib/kubelet/pods/93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267/volumes" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.192344 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kmm6w" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="registry-server" 
containerID="cri-o://9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09" gracePeriod=2 Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.800693 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.835501 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2jlv\" (UniqueName: \"kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv\") pod \"d6aee2a2-becc-44fb-8950-a9464e8b518d\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.835586 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content\") pod \"d6aee2a2-becc-44fb-8950-a9464e8b518d\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.835698 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities\") pod \"d6aee2a2-becc-44fb-8950-a9464e8b518d\" (UID: \"d6aee2a2-becc-44fb-8950-a9464e8b518d\") " Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.836822 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities" (OuterVolumeSpecName: "utilities") pod "d6aee2a2-becc-44fb-8950-a9464e8b518d" (UID: "d6aee2a2-becc-44fb-8950-a9464e8b518d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.840987 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv" (OuterVolumeSpecName: "kube-api-access-v2jlv") pod "d6aee2a2-becc-44fb-8950-a9464e8b518d" (UID: "d6aee2a2-becc-44fb-8950-a9464e8b518d"). InnerVolumeSpecName "kube-api-access-v2jlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.939842 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2jlv\" (UniqueName: \"kubernetes.io/projected/d6aee2a2-becc-44fb-8950-a9464e8b518d-kube-api-access-v2jlv\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.939879 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:29 crc kubenswrapper[4717]: I1007 14:48:29.940375 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6aee2a2-becc-44fb-8950-a9464e8b518d" (UID: "d6aee2a2-becc-44fb-8950-a9464e8b518d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.043326 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6aee2a2-becc-44fb-8950-a9464e8b518d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.204180 4717 generic.go:334] "Generic (PLEG): container finished" podID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerID="9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09" exitCode=0 Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.204235 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kmm6w" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.204253 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerDied","Data":"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09"} Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.204652 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kmm6w" event={"ID":"d6aee2a2-becc-44fb-8950-a9464e8b518d","Type":"ContainerDied","Data":"97a8ef2155a61001d24afc4730a0e472eadd385826569d7cbfd5ea2f9a92560c"} Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.204673 4717 scope.go:117] "RemoveContainer" containerID="9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.237190 4717 scope.go:117] "RemoveContainer" containerID="da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.248893 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.260096 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kmm6w"] Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.268162 4717 scope.go:117] "RemoveContainer" containerID="abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.327120 4717 scope.go:117] "RemoveContainer" containerID="9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09" Oct 07 14:48:30 crc kubenswrapper[4717]: E1007 14:48:30.327540 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09\": container with ID starting with 9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09 not found: ID does not exist" containerID="9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.327576 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09"} err="failed to get container status \"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09\": rpc error: code = NotFound desc = could not find container \"9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09\": container with ID starting with 9cd5e88fd86def77ff4579f3ff7330859afed9906d966565fa01863efcf32d09 not found: ID does not exist" Oct 07 14:48:30 crc 
kubenswrapper[4717]: I1007 14:48:30.327601 4717 scope.go:117] "RemoveContainer" containerID="da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd" Oct 07 14:48:30 crc kubenswrapper[4717]: E1007 14:48:30.328099 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd\": container with ID starting with da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd not found: ID does not exist" containerID="da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.328149 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd"} err="failed to get container status \"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd\": rpc error: code = NotFound desc = could not find container \"da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd\": container with ID starting with da6df192dd3181f25ae298a5901b710eefe6c81b29b8a85dfba5b1a605e314bd not found: ID does not exist" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.328168 4717 scope.go:117] "RemoveContainer" containerID="abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453" Oct 07 14:48:30 crc kubenswrapper[4717]: E1007 14:48:30.334210 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453\": container with ID starting with abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453 not found: ID does not exist" containerID="abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.334248 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453"} err="failed to get container status \"abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453\": rpc error: code = NotFound desc = could not find container \"abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453\": container with ID starting with abff0073dc3cd5f8e550ad798eaba1ce20e489e0f2cfa970fabcc016ae3c0453 not found: ID does not exist" Oct 07 14:48:30 crc kubenswrapper[4717]: I1007 14:48:30.885468 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" path="/var/lib/kubelet/pods/d6aee2a2-becc-44fb-8950-a9464e8b518d/volumes" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.481836 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482770 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="extract-utilities" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482783 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="extract-utilities" Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482803 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="extract-utilities" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482809 4717 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="extract-utilities" Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482819 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482825 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482834 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="extract-content" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482839 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="extract-content" Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482850 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="extract-content" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482857 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="extract-content" Oct 07 14:48:56 crc kubenswrapper[4717]: E1007 14:48:56.482871 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.482877 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.483111 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="93bb1fb8-b7fa-4d1b-9c5e-d0691d06e267" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.483124 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6aee2a2-becc-44fb-8950-a9464e8b518d" containerName="registry-server" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.490350 4717 util.go:30] "No sandbox for pod can be found. 
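
Note on the SyncLoop ADD/UPDATE/DELETE/REMOVE source="api" entries: these are the kubelet reacting to pod changes it receives from the API server. A minimal client-go sketch that watches the same openshift-marketplace pod stream from outside the node; the kubeconfig location is an assumption for illustration:

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	w, err := cs.CoreV1().Pods("openshift-marketplace").Watch(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	defer w.Stop()

	// Each event is an ADDED/MODIFIED/DELETED pod, the same stream the
	// kubelet's sync loop reports as ADD/UPDATE/DELETE above.
	for ev := range w.ResultChan() {
		if pod, ok := ev.Object.(*corev1.Pod); ok {
			fmt.Println(ev.Type, pod.Namespace+"/"+pod.Name)
		}
	}
}
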
Need to start a new one" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.494801 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.583608 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6jp9\" (UniqueName: \"kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.583682 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.583995 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.685629 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6jp9\" (UniqueName: \"kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.685950 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.686064 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.686503 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.686579 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.709948 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-s6jp9\" (UniqueName: \"kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9\") pod \"certified-operators-ld24c\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:56 crc kubenswrapper[4717]: I1007 14:48:56.834430 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:48:57 crc kubenswrapper[4717]: I1007 14:48:57.375500 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:48:57 crc kubenswrapper[4717]: I1007 14:48:57.428091 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerStarted","Data":"6a75b946c727cc632cd730741279f0022af17008d31353f45406eb511e07aa8a"} Oct 07 14:48:58 crc kubenswrapper[4717]: I1007 14:48:58.439848 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerID="f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3" exitCode=0 Oct 07 14:48:58 crc kubenswrapper[4717]: I1007 14:48:58.439929 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerDied","Data":"f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3"} Oct 07 14:48:59 crc kubenswrapper[4717]: I1007 14:48:59.451665 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerStarted","Data":"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449"} Oct 07 14:49:01 crc kubenswrapper[4717]: I1007 14:49:01.470369 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerID="6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449" exitCode=0 Oct 07 14:49:01 crc kubenswrapper[4717]: I1007 14:49:01.470457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerDied","Data":"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449"} Oct 07 14:49:02 crc kubenswrapper[4717]: I1007 14:49:02.483297 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerStarted","Data":"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea"} Oct 07 14:49:02 crc kubenswrapper[4717]: I1007 14:49:02.505444 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ld24c" podStartSLOduration=2.865384439 podStartE2EDuration="6.505421923s" podCreationTimestamp="2025-10-07 14:48:56 +0000 UTC" firstStartedPulling="2025-10-07 14:48:58.442066366 +0000 UTC m=+3320.269992158" lastFinishedPulling="2025-10-07 14:49:02.08210385 +0000 UTC m=+3323.910029642" observedRunningTime="2025-10-07 14:49:02.498784642 +0000 UTC m=+3324.326710434" watchObservedRunningTime="2025-10-07 14:49:02.505421923 +0000 UTC m=+3324.333347715" Oct 07 14:49:06 crc kubenswrapper[4717]: I1007 14:49:06.834683 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:06 crc kubenswrapper[4717]: I1007 14:49:06.837093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:06 crc kubenswrapper[4717]: I1007 14:49:06.900741 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:07 crc kubenswrapper[4717]: I1007 14:49:07.589468 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:07 crc kubenswrapper[4717]: I1007 14:49:07.641811 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:49:09 crc kubenswrapper[4717]: I1007 14:49:09.552641 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ld24c" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="registry-server" containerID="cri-o://f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea" gracePeriod=2 Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.322655 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.468020 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6jp9\" (UniqueName: \"kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9\") pod \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.468232 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities\") pod \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.468378 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content\") pod \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\" (UID: \"b9fb700c-90c7-4f56-a9a7-e8fc9878f873\") " Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.481045 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9" (OuterVolumeSpecName: "kube-api-access-s6jp9") pod "b9fb700c-90c7-4f56-a9a7-e8fc9878f873" (UID: "b9fb700c-90c7-4f56-a9a7-e8fc9878f873"). InnerVolumeSpecName "kube-api-access-s6jp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.487913 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities" (OuterVolumeSpecName: "utilities") pod "b9fb700c-90c7-4f56-a9a7-e8fc9878f873" (UID: "b9fb700c-90c7-4f56-a9a7-e8fc9878f873"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.519478 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9fb700c-90c7-4f56-a9a7-e8fc9878f873" (UID: "b9fb700c-90c7-4f56-a9a7-e8fc9878f873"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.563386 4717 generic.go:334] "Generic (PLEG): container finished" podID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerID="f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea" exitCode=0 Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.563427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerDied","Data":"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea"} Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.563471 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ld24c" event={"ID":"b9fb700c-90c7-4f56-a9a7-e8fc9878f873","Type":"ContainerDied","Data":"6a75b946c727cc632cd730741279f0022af17008d31353f45406eb511e07aa8a"} Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.563494 4717 scope.go:117] "RemoveContainer" containerID="f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.563577 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ld24c" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.571329 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.571517 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6jp9\" (UniqueName: \"kubernetes.io/projected/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-kube-api-access-s6jp9\") on node \"crc\" DevicePath \"\"" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.571865 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9fb700c-90c7-4f56-a9a7-e8fc9878f873-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.598607 4717 scope.go:117] "RemoveContainer" containerID="6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.603458 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.620667 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ld24c"] Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.621961 4717 scope.go:117] "RemoveContainer" containerID="f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.682440 4717 scope.go:117] "RemoveContainer" containerID="f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea" Oct 07 14:49:10 crc kubenswrapper[4717]: E1007 14:49:10.683194 4717 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea\": container with ID starting with f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea not found: ID does not exist" containerID="f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.683257 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea"} err="failed to get container status \"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea\": rpc error: code = NotFound desc = could not find container \"f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea\": container with ID starting with f2917da71504affc774ab626205a3b5a34430beb5811b67f931d9c1385cee4ea not found: ID does not exist" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.683309 4717 scope.go:117] "RemoveContainer" containerID="6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449" Oct 07 14:49:10 crc kubenswrapper[4717]: E1007 14:49:10.683784 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449\": container with ID starting with 6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449 not found: ID does not exist" containerID="6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.683827 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449"} err="failed to get container status \"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449\": rpc error: code = NotFound desc = could not find container \"6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449\": container with ID starting with 6ad5a5f5172213e33bf39239397258d52d05f2178b773a74c7df386358dfe449 not found: ID does not exist" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.683860 4717 scope.go:117] "RemoveContainer" containerID="f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3" Oct 07 14:49:10 crc kubenswrapper[4717]: E1007 14:49:10.684203 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3\": container with ID starting with f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3 not found: ID does not exist" containerID="f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.684255 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3"} err="failed to get container status \"f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3\": rpc error: code = NotFound desc = could not find container \"f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3\": container with ID starting with f4be475fbd0730ca72aa26fa30d4aa15feef2b5b771f05e718271d1b43f771d3 not found: ID does not exist" Oct 07 14:49:10 crc kubenswrapper[4717]: I1007 14:49:10.878683 4717 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" path="/var/lib/kubelet/pods/b9fb700c-90c7-4f56-a9a7-e8fc9878f873/volumes" Oct 07 14:49:31 crc kubenswrapper[4717]: I1007 14:49:31.610104 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:49:31 crc kubenswrapper[4717]: I1007 14:49:31.610652 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:50:01 crc kubenswrapper[4717]: I1007 14:50:01.609747 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:50:01 crc kubenswrapper[4717]: I1007 14:50:01.610317 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.394973 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:27 crc kubenswrapper[4717]: E1007 14:50:27.396709 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="extract-content" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.396732 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="extract-content" Oct 07 14:50:27 crc kubenswrapper[4717]: E1007 14:50:27.396771 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="registry-server" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.396782 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="registry-server" Oct 07 14:50:27 crc kubenswrapper[4717]: E1007 14:50:27.396803 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="extract-utilities" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.396813 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="extract-utilities" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.397158 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9fb700c-90c7-4f56-a9a7-e8fc9878f873" containerName="registry-server" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.398982 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.427722 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.513258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qffh\" (UniqueName: \"kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.513603 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.513686 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.616486 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.616559 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.616686 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qffh\" (UniqueName: \"kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.617073 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.617293 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.640699 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2qffh\" (UniqueName: \"kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh\") pod \"community-operators-fq6z7\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:27 crc kubenswrapper[4717]: I1007 14:50:27.721235 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:28 crc kubenswrapper[4717]: I1007 14:50:28.392117 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:29 crc kubenswrapper[4717]: I1007 14:50:29.296696 4717 generic.go:334] "Generic (PLEG): container finished" podID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerID="7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183" exitCode=0 Oct 07 14:50:29 crc kubenswrapper[4717]: I1007 14:50:29.296761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerDied","Data":"7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183"} Oct 07 14:50:29 crc kubenswrapper[4717]: I1007 14:50:29.297086 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerStarted","Data":"1beb721336192567a1e0b1f050cb6553cc74f26fd1cfa0660a2e59585861677a"} Oct 07 14:50:31 crc kubenswrapper[4717]: I1007 14:50:31.609714 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:50:31 crc kubenswrapper[4717]: I1007 14:50:31.610495 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:50:31 crc kubenswrapper[4717]: I1007 14:50:31.610552 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:50:31 crc kubenswrapper[4717]: I1007 14:50:31.611428 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:50:31 crc kubenswrapper[4717]: I1007 14:50:31.611481 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31" gracePeriod=600 Oct 07 14:50:32 crc kubenswrapper[4717]: I1007 14:50:32.327916 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" 
containerID="41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31" exitCode=0 Oct 07 14:50:32 crc kubenswrapper[4717]: I1007 14:50:32.327955 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31"} Oct 07 14:50:32 crc kubenswrapper[4717]: I1007 14:50:32.328406 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf"} Oct 07 14:50:32 crc kubenswrapper[4717]: I1007 14:50:32.328424 4717 scope.go:117] "RemoveContainer" containerID="a3f3c39de40eaca39efb8cedf105da342d5b03c339ea46817f65255daeca0b2e" Oct 07 14:50:32 crc kubenswrapper[4717]: I1007 14:50:32.333593 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerStarted","Data":"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215"} Oct 07 14:50:36 crc kubenswrapper[4717]: I1007 14:50:36.396973 4717 generic.go:334] "Generic (PLEG): container finished" podID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerID="b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215" exitCode=0 Oct 07 14:50:36 crc kubenswrapper[4717]: I1007 14:50:36.397064 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerDied","Data":"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215"} Oct 07 14:50:37 crc kubenswrapper[4717]: I1007 14:50:37.413200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerStarted","Data":"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f"} Oct 07 14:50:37 crc kubenswrapper[4717]: I1007 14:50:37.444679 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fq6z7" podStartSLOduration=2.683472953 podStartE2EDuration="10.444655259s" podCreationTimestamp="2025-10-07 14:50:27 +0000 UTC" firstStartedPulling="2025-10-07 14:50:29.298965928 +0000 UTC m=+3411.126891720" lastFinishedPulling="2025-10-07 14:50:37.060148234 +0000 UTC m=+3418.888074026" observedRunningTime="2025-10-07 14:50:37.433444532 +0000 UTC m=+3419.261370344" watchObservedRunningTime="2025-10-07 14:50:37.444655259 +0000 UTC m=+3419.272581061" Oct 07 14:50:37 crc kubenswrapper[4717]: I1007 14:50:37.721614 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:37 crc kubenswrapper[4717]: I1007 14:50:37.721670 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:38 crc kubenswrapper[4717]: I1007 14:50:38.777667 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fq6z7" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="registry-server" probeResult="failure" output=< Oct 07 14:50:38 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 
1s Oct 07 14:50:38 crc kubenswrapper[4717]: > Oct 07 14:50:47 crc kubenswrapper[4717]: I1007 14:50:47.772581 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:47 crc kubenswrapper[4717]: I1007 14:50:47.826430 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:48 crc kubenswrapper[4717]: I1007 14:50:48.027361 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:49 crc kubenswrapper[4717]: I1007 14:50:49.548687 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fq6z7" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="registry-server" containerID="cri-o://57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f" gracePeriod=2 Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.540678 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.565909 4717 generic.go:334] "Generic (PLEG): container finished" podID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerID="57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f" exitCode=0 Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.566001 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerDied","Data":"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f"} Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.566069 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fq6z7" event={"ID":"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4","Type":"ContainerDied","Data":"1beb721336192567a1e0b1f050cb6553cc74f26fd1cfa0660a2e59585861677a"} Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.566092 4717 scope.go:117] "RemoveContainer" containerID="57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.566365 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fq6z7" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.599424 4717 scope.go:117] "RemoveContainer" containerID="b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.635563 4717 scope.go:117] "RemoveContainer" containerID="7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.674273 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qffh\" (UniqueName: \"kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh\") pod \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.675120 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content\") pod \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.675639 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities\") pod \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\" (UID: \"e49e18e7-eafe-4bb9-a353-b3622d7c9ac4\") " Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.679342 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities" (OuterVolumeSpecName: "utilities") pod "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" (UID: "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.684962 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh" (OuterVolumeSpecName: "kube-api-access-2qffh") pod "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" (UID: "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4"). InnerVolumeSpecName "kube-api-access-2qffh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.705635 4717 scope.go:117] "RemoveContainer" containerID="57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f" Oct 07 14:50:50 crc kubenswrapper[4717]: E1007 14:50:50.710305 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f\": container with ID starting with 57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f not found: ID does not exist" containerID="57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.710580 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f"} err="failed to get container status \"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f\": rpc error: code = NotFound desc = could not find container \"57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f\": container with ID starting with 57b6ac5f47b21b6d35e0933a3c3d27ab2aeb1430601ed18ed860ae42c402226f not found: ID does not exist" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.710683 4717 scope.go:117] "RemoveContainer" containerID="b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215" Oct 07 14:50:50 crc kubenswrapper[4717]: E1007 14:50:50.712667 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215\": container with ID starting with b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215 not found: ID does not exist" containerID="b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.713109 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215"} err="failed to get container status \"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215\": rpc error: code = NotFound desc = could not find container \"b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215\": container with ID starting with b84511d2a8d93e581c9da5c950f4f314bee380e5bf3668ada148e58730577215 not found: ID does not exist" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.713236 4717 scope.go:117] "RemoveContainer" containerID="7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183" Oct 07 14:50:50 crc kubenswrapper[4717]: E1007 14:50:50.713744 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183\": container with ID starting with 7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183 not found: ID does not exist" containerID="7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.713813 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183"} err="failed to get container status \"7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183\": rpc error: code = NotFound desc = could not 
find container \"7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183\": container with ID starting with 7eed7032d7bcc3e7e1920c1f11c0657add3639024961b467c6897322b387f183 not found: ID does not exist" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.753593 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" (UID: "e49e18e7-eafe-4bb9-a353-b3622d7c9ac4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.778547 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.778580 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.778592 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qffh\" (UniqueName: \"kubernetes.io/projected/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4-kube-api-access-2qffh\") on node \"crc\" DevicePath \"\"" Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.918240 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:50 crc kubenswrapper[4717]: I1007 14:50:50.929084 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fq6z7"] Oct 07 14:50:52 crc kubenswrapper[4717]: I1007 14:50:52.883818 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" path="/var/lib/kubelet/pods/e49e18e7-eafe-4bb9-a353-b3622d7c9ac4/volumes" Oct 07 14:53:01 crc kubenswrapper[4717]: I1007 14:53:01.609690 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:53:01 crc kubenswrapper[4717]: I1007 14:53:01.610303 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:53:31 crc kubenswrapper[4717]: I1007 14:53:31.609690 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:53:31 crc kubenswrapper[4717]: I1007 14:53:31.610296 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 07 14:54:01 crc kubenswrapper[4717]: I1007 14:54:01.610297 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 14:54:01 crc kubenswrapper[4717]: I1007 14:54:01.610802 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 14:54:01 crc kubenswrapper[4717]: I1007 14:54:01.610849 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 14:54:01 crc kubenswrapper[4717]: I1007 14:54:01.611669 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 14:54:01 crc kubenswrapper[4717]: I1007 14:54:01.611722 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" gracePeriod=600 Oct 07 14:54:01 crc kubenswrapper[4717]: E1007 14:54:01.746460 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:54:02 crc kubenswrapper[4717]: I1007 14:54:02.515990 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" exitCode=0 Oct 07 14:54:02 crc kubenswrapper[4717]: I1007 14:54:02.516331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf"} Oct 07 14:54:02 crc kubenswrapper[4717]: I1007 14:54:02.516369 4717 scope.go:117] "RemoveContainer" containerID="41ca09c5ec0884558ea44f72b072c002159c58be55dba17ea5b1e22e93ec5d31" Oct 07 14:54:02 crc kubenswrapper[4717]: I1007 14:54:02.517262 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:54:02 crc kubenswrapper[4717]: E1007 14:54:02.517573 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:54:16 crc kubenswrapper[4717]: I1007 14:54:16.868611 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:54:16 crc kubenswrapper[4717]: E1007 14:54:16.869833 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:54:30 crc kubenswrapper[4717]: I1007 14:54:30.869307 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:54:30 crc kubenswrapper[4717]: E1007 14:54:30.870419 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:54:44 crc kubenswrapper[4717]: I1007 14:54:44.868323 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:54:44 crc kubenswrapper[4717]: E1007 14:54:44.869237 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:54:57 crc kubenswrapper[4717]: I1007 14:54:57.868195 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:54:57 crc kubenswrapper[4717]: E1007 14:54:57.868898 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:55:09 crc kubenswrapper[4717]: I1007 14:55:09.869755 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:55:09 crc kubenswrapper[4717]: E1007 14:55:09.871072 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" 
podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:55:20 crc kubenswrapper[4717]: I1007 14:55:20.868185 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:55:20 crc kubenswrapper[4717]: E1007 14:55:20.869237 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:55:34 crc kubenswrapper[4717]: I1007 14:55:34.868220 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:55:34 crc kubenswrapper[4717]: E1007 14:55:34.869767 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:55:45 crc kubenswrapper[4717]: I1007 14:55:45.868306 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:55:45 crc kubenswrapper[4717]: E1007 14:55:45.868995 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:55:58 crc kubenswrapper[4717]: I1007 14:55:58.875988 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:55:58 crc kubenswrapper[4717]: E1007 14:55:58.876842 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:56:11 crc kubenswrapper[4717]: I1007 14:56:11.869153 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:56:11 crc kubenswrapper[4717]: E1007 14:56:11.869890 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:56:22 crc kubenswrapper[4717]: I1007 14:56:22.868910 4717 scope.go:117] "RemoveContainer" 
containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:56:22 crc kubenswrapper[4717]: E1007 14:56:22.869918 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:56:34 crc kubenswrapper[4717]: I1007 14:56:34.869376 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:56:34 crc kubenswrapper[4717]: E1007 14:56:34.870152 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:56:49 crc kubenswrapper[4717]: I1007 14:56:49.868310 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:56:49 crc kubenswrapper[4717]: E1007 14:56:49.869353 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:57:04 crc kubenswrapper[4717]: I1007 14:57:04.869144 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:57:04 crc kubenswrapper[4717]: E1007 14:57:04.869933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:57:18 crc kubenswrapper[4717]: I1007 14:57:18.876282 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:57:18 crc kubenswrapper[4717]: E1007 14:57:18.877138 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:57:30 crc kubenswrapper[4717]: I1007 14:57:30.892428 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:57:30 crc kubenswrapper[4717]: E1007 14:57:30.897883 4717 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:57:43 crc kubenswrapper[4717]: I1007 14:57:43.868924 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:57:43 crc kubenswrapper[4717]: E1007 14:57:43.870913 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:57:55 crc kubenswrapper[4717]: I1007 14:57:55.868828 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:57:55 crc kubenswrapper[4717]: E1007 14:57:55.869911 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:58:10 crc kubenswrapper[4717]: I1007 14:58:10.870548 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:58:10 crc kubenswrapper[4717]: E1007 14:58:10.871432 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:58:23 crc kubenswrapper[4717]: I1007 14:58:23.868665 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:58:23 crc kubenswrapper[4717]: E1007 14:58:23.869470 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.718667 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:58:31 crc kubenswrapper[4717]: E1007 14:58:31.719755 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="extract-utilities" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.719773 4717 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="extract-utilities" Oct 07 14:58:31 crc kubenswrapper[4717]: E1007 14:58:31.719810 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="extract-content" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.719819 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="extract-content" Oct 07 14:58:31 crc kubenswrapper[4717]: E1007 14:58:31.719845 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="registry-server" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.719854 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="registry-server" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.720112 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="e49e18e7-eafe-4bb9-a353-b3622d7c9ac4" containerName="registry-server" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.721976 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.732217 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.808902 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.808990 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nztlb\" (UniqueName: \"kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.809196 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.911633 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.911795 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.911856 4717 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nztlb\" (UniqueName: \"kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.912521 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.912542 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:31 crc kubenswrapper[4717]: I1007 14:58:31.944475 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nztlb\" (UniqueName: \"kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb\") pod \"redhat-operators-n8bjb\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:32 crc kubenswrapper[4717]: I1007 14:58:32.042660 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:32 crc kubenswrapper[4717]: I1007 14:58:32.612779 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:58:33 crc kubenswrapper[4717]: I1007 14:58:33.094667 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerID="081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb" exitCode=0 Oct 07 14:58:33 crc kubenswrapper[4717]: I1007 14:58:33.094777 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerDied","Data":"081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb"} Oct 07 14:58:33 crc kubenswrapper[4717]: I1007 14:58:33.095029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerStarted","Data":"e4db3b1268782041aa482f2f9d7baf1a83a1f8a9100d31b8c09705ed8126dffe"} Oct 07 14:58:33 crc kubenswrapper[4717]: I1007 14:58:33.096834 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 14:58:35 crc kubenswrapper[4717]: I1007 14:58:35.115958 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerStarted","Data":"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649"} Oct 07 14:58:36 crc kubenswrapper[4717]: I1007 14:58:36.868579 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:58:36 crc kubenswrapper[4717]: E1007 14:58:36.868924 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.695715 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.698503 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.714577 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.815856 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.816070 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.816098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4qf2\" (UniqueName: \"kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.917501 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.918056 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.918398 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.918510 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4qf2\" (UniqueName: \"kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2\") pod 
\"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.918842 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:40 crc kubenswrapper[4717]: I1007 14:58:40.942194 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4qf2\" (UniqueName: \"kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2\") pod \"redhat-marketplace-sxpg2\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:41 crc kubenswrapper[4717]: I1007 14:58:41.030622 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:41 crc kubenswrapper[4717]: I1007 14:58:41.209804 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerID="ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649" exitCode=0 Oct 07 14:58:41 crc kubenswrapper[4717]: I1007 14:58:41.209871 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerDied","Data":"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649"} Oct 07 14:58:41 crc kubenswrapper[4717]: I1007 14:58:41.553968 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:42 crc kubenswrapper[4717]: I1007 14:58:42.226781 4717 generic.go:334] "Generic (PLEG): container finished" podID="27604d01-071c-4a2c-80cb-01de8401de57" containerID="3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45" exitCode=0 Oct 07 14:58:42 crc kubenswrapper[4717]: I1007 14:58:42.226873 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerDied","Data":"3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45"} Oct 07 14:58:42 crc kubenswrapper[4717]: I1007 14:58:42.227145 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerStarted","Data":"7581b53fb56a249eae64605281f1808fc3f743f474651d5f715e20891909fc78"} Oct 07 14:58:43 crc kubenswrapper[4717]: I1007 14:58:43.264067 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerStarted","Data":"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa"} Oct 07 14:58:43 crc kubenswrapper[4717]: I1007 14:58:43.296418 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8bjb" podStartSLOduration=3.336630156 podStartE2EDuration="12.296393567s" podCreationTimestamp="2025-10-07 14:58:31 +0000 UTC" firstStartedPulling="2025-10-07 14:58:33.096430599 +0000 UTC m=+3894.924356391" lastFinishedPulling="2025-10-07 14:58:42.05619401 +0000 UTC m=+3903.884119802" 
observedRunningTime="2025-10-07 14:58:43.284256425 +0000 UTC m=+3905.112182217" watchObservedRunningTime="2025-10-07 14:58:43.296393567 +0000 UTC m=+3905.124319369" Oct 07 14:58:44 crc kubenswrapper[4717]: I1007 14:58:44.277050 4717 generic.go:334] "Generic (PLEG): container finished" podID="27604d01-071c-4a2c-80cb-01de8401de57" containerID="458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05" exitCode=0 Oct 07 14:58:44 crc kubenswrapper[4717]: I1007 14:58:44.277116 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerDied","Data":"458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05"} Oct 07 14:58:45 crc kubenswrapper[4717]: I1007 14:58:45.290753 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerStarted","Data":"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a"} Oct 07 14:58:45 crc kubenswrapper[4717]: I1007 14:58:45.318538 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sxpg2" podStartSLOduration=2.791717089 podStartE2EDuration="5.318514161s" podCreationTimestamp="2025-10-07 14:58:40 +0000 UTC" firstStartedPulling="2025-10-07 14:58:42.229508568 +0000 UTC m=+3904.057434360" lastFinishedPulling="2025-10-07 14:58:44.75630564 +0000 UTC m=+3906.584231432" observedRunningTime="2025-10-07 14:58:45.314183042 +0000 UTC m=+3907.142108854" watchObservedRunningTime="2025-10-07 14:58:45.318514161 +0000 UTC m=+3907.146439953" Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.031570 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.033414 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.081330 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.407093 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.465065 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:51 crc kubenswrapper[4717]: I1007 14:58:51.869834 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:58:51 crc kubenswrapper[4717]: E1007 14:58:51.870126 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 14:58:52 crc kubenswrapper[4717]: I1007 14:58:52.044034 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:52 crc kubenswrapper[4717]: I1007 14:58:52.044079 
4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:58:53 crc kubenswrapper[4717]: I1007 14:58:53.090370 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8bjb" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" probeResult="failure" output=< Oct 07 14:58:53 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:58:53 crc kubenswrapper[4717]: > Oct 07 14:58:53 crc kubenswrapper[4717]: I1007 14:58:53.364233 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sxpg2" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="registry-server" containerID="cri-o://f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a" gracePeriod=2 Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.170673 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.316804 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content\") pod \"27604d01-071c-4a2c-80cb-01de8401de57\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.316864 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4qf2\" (UniqueName: \"kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2\") pod \"27604d01-071c-4a2c-80cb-01de8401de57\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.317046 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities\") pod \"27604d01-071c-4a2c-80cb-01de8401de57\" (UID: \"27604d01-071c-4a2c-80cb-01de8401de57\") " Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.318667 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities" (OuterVolumeSpecName: "utilities") pod "27604d01-071c-4a2c-80cb-01de8401de57" (UID: "27604d01-071c-4a2c-80cb-01de8401de57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.326877 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2" (OuterVolumeSpecName: "kube-api-access-z4qf2") pod "27604d01-071c-4a2c-80cb-01de8401de57" (UID: "27604d01-071c-4a2c-80cb-01de8401de57"). InnerVolumeSpecName "kube-api-access-z4qf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.333766 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "27604d01-071c-4a2c-80cb-01de8401de57" (UID: "27604d01-071c-4a2c-80cb-01de8401de57"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.375063 4717 generic.go:334] "Generic (PLEG): container finished" podID="27604d01-071c-4a2c-80cb-01de8401de57" containerID="f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a" exitCode=0 Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.375109 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerDied","Data":"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a"} Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.375156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sxpg2" event={"ID":"27604d01-071c-4a2c-80cb-01de8401de57","Type":"ContainerDied","Data":"7581b53fb56a249eae64605281f1808fc3f743f474651d5f715e20891909fc78"} Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.375167 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sxpg2" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.375177 4717 scope.go:117] "RemoveContainer" containerID="f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.401574 4717 scope.go:117] "RemoveContainer" containerID="458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.414136 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.420566 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.420600 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/27604d01-071c-4a2c-80cb-01de8401de57-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.420610 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4qf2\" (UniqueName: \"kubernetes.io/projected/27604d01-071c-4a2c-80cb-01de8401de57-kube-api-access-z4qf2\") on node \"crc\" DevicePath \"\"" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.422983 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sxpg2"] Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.441169 4717 scope.go:117] "RemoveContainer" containerID="3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.479635 4717 scope.go:117] "RemoveContainer" containerID="f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a" Oct 07 14:58:54 crc kubenswrapper[4717]: E1007 14:58:54.480572 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a\": container with ID starting with f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a not found: ID does not exist" containerID="f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.480614 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a"} err="failed to get container status \"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a\": rpc error: code = NotFound desc = could not find container \"f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a\": container with ID starting with f039b0253c8736c84065ff600cbcb863b6dc5351d9d4ef2ff86e63edf00ae69a not found: ID does not exist" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.480647 4717 scope.go:117] "RemoveContainer" containerID="458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05" Oct 07 14:58:54 crc kubenswrapper[4717]: E1007 14:58:54.480953 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05\": container with ID starting with 458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05 not found: ID does not exist" containerID="458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.480981 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05"} err="failed to get container status \"458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05\": rpc error: code = NotFound desc = could not find container \"458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05\": container with ID starting with 458f1fb4e3a4693bda86344e21b08a25ed974224727ef8f47092e7d539e0ba05 not found: ID does not exist" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.481000 4717 scope.go:117] "RemoveContainer" containerID="3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45" Oct 07 14:58:54 crc kubenswrapper[4717]: E1007 14:58:54.481402 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45\": container with ID starting with 3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45 not found: ID does not exist" containerID="3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.481450 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45"} err="failed to get container status \"3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45\": rpc error: code = NotFound desc = could not find container \"3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45\": container with ID starting with 3427c4bf2a2d94e7290e4a88d8d112d6fde66799f3ac49050b2351286402cd45 not found: ID does not exist" Oct 07 14:58:54 crc kubenswrapper[4717]: I1007 14:58:54.886315 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27604d01-071c-4a2c-80cb-01de8401de57" path="/var/lib/kubelet/pods/27604d01-071c-4a2c-80cb-01de8401de57/volumes" Oct 07 14:59:03 crc kubenswrapper[4717]: I1007 14:59:03.104516 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8bjb" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" probeResult="failure" output=< Oct 07 14:59:03 crc 
kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:59:03 crc kubenswrapper[4717]: > Oct 07 14:59:05 crc kubenswrapper[4717]: I1007 14:59:05.869785 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 14:59:06 crc kubenswrapper[4717]: I1007 14:59:06.485883 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047"} Oct 07 14:59:13 crc kubenswrapper[4717]: I1007 14:59:13.106627 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n8bjb" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" probeResult="failure" output=< Oct 07 14:59:13 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 14:59:13 crc kubenswrapper[4717]: > Oct 07 14:59:22 crc kubenswrapper[4717]: I1007 14:59:22.287274 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:59:22 crc kubenswrapper[4717]: I1007 14:59:22.344273 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:59:22 crc kubenswrapper[4717]: I1007 14:59:22.528572 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:59:23 crc kubenswrapper[4717]: I1007 14:59:23.646845 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8bjb" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" containerID="cri-o://01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa" gracePeriod=2 Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.333797 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.484297 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content\") pod \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.485576 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nztlb\" (UniqueName: \"kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb\") pod \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.485783 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities\") pod \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\" (UID: \"cd343b81-bbc5-4a1e-b6df-c72433822e1e\") " Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.486419 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities" (OuterVolumeSpecName: "utilities") pod "cd343b81-bbc5-4a1e-b6df-c72433822e1e" (UID: "cd343b81-bbc5-4a1e-b6df-c72433822e1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.486596 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.502158 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb" (OuterVolumeSpecName: "kube-api-access-nztlb") pod "cd343b81-bbc5-4a1e-b6df-c72433822e1e" (UID: "cd343b81-bbc5-4a1e-b6df-c72433822e1e"). InnerVolumeSpecName "kube-api-access-nztlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.577941 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd343b81-bbc5-4a1e-b6df-c72433822e1e" (UID: "cd343b81-bbc5-4a1e-b6df-c72433822e1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.588505 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd343b81-bbc5-4a1e-b6df-c72433822e1e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.588539 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nztlb\" (UniqueName: \"kubernetes.io/projected/cd343b81-bbc5-4a1e-b6df-c72433822e1e-kube-api-access-nztlb\") on node \"crc\" DevicePath \"\"" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.657509 4717 generic.go:334] "Generic (PLEG): container finished" podID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerID="01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa" exitCode=0 Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.657549 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerDied","Data":"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa"} Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.657582 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8bjb" event={"ID":"cd343b81-bbc5-4a1e-b6df-c72433822e1e","Type":"ContainerDied","Data":"e4db3b1268782041aa482f2f9d7baf1a83a1f8a9100d31b8c09705ed8126dffe"} Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.657603 4717 scope.go:117] "RemoveContainer" containerID="01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.657598 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n8bjb" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.693145 4717 scope.go:117] "RemoveContainer" containerID="ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.701811 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.709491 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8bjb"] Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.720549 4717 scope.go:117] "RemoveContainer" containerID="081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.774771 4717 scope.go:117] "RemoveContainer" containerID="01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa" Oct 07 14:59:24 crc kubenswrapper[4717]: E1007 14:59:24.775455 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa\": container with ID starting with 01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa not found: ID does not exist" containerID="01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.775508 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa"} err="failed to get container status \"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa\": rpc error: code = NotFound desc = could not find container \"01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa\": container with ID starting with 01cdb8ff28169dc35ae45b2fc48cd422e1771a5de1e168fec26a5007abddbdfa not found: ID does not exist" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.775561 4717 scope.go:117] "RemoveContainer" containerID="ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649" Oct 07 14:59:24 crc kubenswrapper[4717]: E1007 14:59:24.775963 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649\": container with ID starting with ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649 not found: ID does not exist" containerID="ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.775996 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649"} err="failed to get container status \"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649\": rpc error: code = NotFound desc = could not find container \"ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649\": container with ID starting with ae384d008266ec1609b803de2f52a7fd4567dff683909505a58128dd64fec649 not found: ID does not exist" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.776040 4717 scope.go:117] "RemoveContainer" containerID="081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb" Oct 07 14:59:24 crc kubenswrapper[4717]: E1007 14:59:24.776476 4717 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb\": container with ID starting with 081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb not found: ID does not exist" containerID="081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.776545 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb"} err="failed to get container status \"081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb\": rpc error: code = NotFound desc = could not find container \"081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb\": container with ID starting with 081643452ffcb60ab13451a900eab5b75ef0fe2d5375f83b2e28fce502a60adb not found: ID does not exist" Oct 07 14:59:24 crc kubenswrapper[4717]: I1007 14:59:24.881122 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" path="/var/lib/kubelet/pods/cd343b81-bbc5-4a1e-b6df-c72433822e1e/volumes" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.182883 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7"] Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.183979 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.183993 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.184001 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184023 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.184039 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184046 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.184056 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184062 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="extract-utilities" Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.184080 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184092 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="extract-content" Oct 07 15:00:00 crc kubenswrapper[4717]: E1007 15:00:00.184105 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" 
containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184112 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184371 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="27604d01-071c-4a2c-80cb-01de8401de57" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.184385 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd343b81-bbc5-4a1e-b6df-c72433822e1e" containerName="registry-server" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.185063 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.191757 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.192047 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.197261 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7"] Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.260208 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.260397 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq6q5\" (UniqueName: \"kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.260441 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.363029 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.363172 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq6q5\" (UniqueName: \"kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.363217 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.365405 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.372366 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.382762 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq6q5\" (UniqueName: \"kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5\") pod \"collect-profiles-29330820-6ktr7\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:00 crc kubenswrapper[4717]: I1007 15:00:00.525580 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:01 crc kubenswrapper[4717]: I1007 15:00:01.049658 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7"] Oct 07 15:00:01 crc kubenswrapper[4717]: W1007 15:00:01.075693 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854b5259_044f_42d2_9e74_00c9eadcfec9.slice/crio-6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4 WatchSource:0}: Error finding container 6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4: Status 404 returned error can't find the container with id 6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4 Oct 07 15:00:01 crc kubenswrapper[4717]: I1007 15:00:01.999395 4717 generic.go:334] "Generic (PLEG): container finished" podID="854b5259-044f-42d2-9e74-00c9eadcfec9" containerID="87c2e7ee1cd7f04c43fe8cf8d28d87747d16b311537b9482f460bf2f22fc737e" exitCode=0 Oct 07 15:00:01 crc kubenswrapper[4717]: I1007 15:00:01.999755 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" event={"ID":"854b5259-044f-42d2-9e74-00c9eadcfec9","Type":"ContainerDied","Data":"87c2e7ee1cd7f04c43fe8cf8d28d87747d16b311537b9482f460bf2f22fc737e"} Oct 07 15:00:01 crc kubenswrapper[4717]: I1007 15:00:01.999780 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" event={"ID":"854b5259-044f-42d2-9e74-00c9eadcfec9","Type":"ContainerStarted","Data":"6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4"} Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.649796 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.751160 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq6q5\" (UniqueName: \"kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5\") pod \"854b5259-044f-42d2-9e74-00c9eadcfec9\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.751284 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume\") pod \"854b5259-044f-42d2-9e74-00c9eadcfec9\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.751411 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume\") pod \"854b5259-044f-42d2-9e74-00c9eadcfec9\" (UID: \"854b5259-044f-42d2-9e74-00c9eadcfec9\") " Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.752340 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume" (OuterVolumeSpecName: "config-volume") pod "854b5259-044f-42d2-9e74-00c9eadcfec9" (UID: "854b5259-044f-42d2-9e74-00c9eadcfec9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.764231 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "854b5259-044f-42d2-9e74-00c9eadcfec9" (UID: "854b5259-044f-42d2-9e74-00c9eadcfec9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.780602 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5" (OuterVolumeSpecName: "kube-api-access-bq6q5") pod "854b5259-044f-42d2-9e74-00c9eadcfec9" (UID: "854b5259-044f-42d2-9e74-00c9eadcfec9"). InnerVolumeSpecName "kube-api-access-bq6q5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.855293 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/854b5259-044f-42d2-9e74-00c9eadcfec9-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.855330 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/854b5259-044f-42d2-9e74-00c9eadcfec9-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:03 crc kubenswrapper[4717]: I1007 15:00:03.855346 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq6q5\" (UniqueName: \"kubernetes.io/projected/854b5259-044f-42d2-9e74-00c9eadcfec9-kube-api-access-bq6q5\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.032457 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" event={"ID":"854b5259-044f-42d2-9e74-00c9eadcfec9","Type":"ContainerDied","Data":"6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4"} Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.032774 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6983371b518a172e082eafd229f3651a9c1a88e78bd3572ae7b37561516fdff4" Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.032830 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330820-6ktr7" Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.730025 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g"] Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.740486 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330775-rh52g"] Oct 07 15:00:04 crc kubenswrapper[4717]: I1007 15:00:04.881127 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5" path="/var/lib/kubelet/pods/ec0de7ce-3011-4d1f-a8f8-01d77d81bbf5/volumes" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.378231 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6hxfl"] Oct 07 15:00:10 crc kubenswrapper[4717]: E1007 15:00:10.379421 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="854b5259-044f-42d2-9e74-00c9eadcfec9" containerName="collect-profiles" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.379638 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="854b5259-044f-42d2-9e74-00c9eadcfec9" containerName="collect-profiles" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.379939 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="854b5259-044f-42d2-9e74-00c9eadcfec9" containerName="collect-profiles" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.382018 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.418153 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6hxfl"] Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.497758 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-catalog-content\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.497863 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-utilities\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.498098 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxtc8\" (UniqueName: \"kubernetes.io/projected/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-kube-api-access-bxtc8\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.600737 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-catalog-content\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc 
kubenswrapper[4717]: I1007 15:00:10.600830 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-utilities\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.600860 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxtc8\" (UniqueName: \"kubernetes.io/projected/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-kube-api-access-bxtc8\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.601620 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-catalog-content\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.601646 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-utilities\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.622923 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxtc8\" (UniqueName: \"kubernetes.io/projected/be55c7be-af1c-44bc-ac2c-ac1db8fb4a82-kube-api-access-bxtc8\") pod \"certified-operators-6hxfl\" (UID: \"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82\") " pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:10 crc kubenswrapper[4717]: I1007 15:00:10.706231 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:11 crc kubenswrapper[4717]: I1007 15:00:11.296723 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6hxfl"] Oct 07 15:00:12 crc kubenswrapper[4717]: I1007 15:00:12.132318 4717 generic.go:334] "Generic (PLEG): container finished" podID="be55c7be-af1c-44bc-ac2c-ac1db8fb4a82" containerID="16b24b4be5ff8d2e46ddb4ebc03f2bb73b19b2577d322743a45ad7e074805640" exitCode=0 Oct 07 15:00:12 crc kubenswrapper[4717]: I1007 15:00:12.132463 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6hxfl" event={"ID":"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82","Type":"ContainerDied","Data":"16b24b4be5ff8d2e46ddb4ebc03f2bb73b19b2577d322743a45ad7e074805640"} Oct 07 15:00:12 crc kubenswrapper[4717]: I1007 15:00:12.132712 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6hxfl" event={"ID":"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82","Type":"ContainerStarted","Data":"6bcb795b1061282cc12a5fe2699a9a87ec8bf922139647c09269e916e547bea9"} Oct 07 15:00:18 crc kubenswrapper[4717]: I1007 15:00:18.195438 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6hxfl" event={"ID":"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82","Type":"ContainerStarted","Data":"755ea01224fb066d04c976f8cad376872c618107233b8561fec96e86bd60c421"} Oct 07 15:00:19 crc kubenswrapper[4717]: I1007 15:00:19.207495 4717 generic.go:334] "Generic (PLEG): container finished" podID="be55c7be-af1c-44bc-ac2c-ac1db8fb4a82" containerID="755ea01224fb066d04c976f8cad376872c618107233b8561fec96e86bd60c421" exitCode=0 Oct 07 15:00:19 crc kubenswrapper[4717]: I1007 15:00:19.207897 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6hxfl" event={"ID":"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82","Type":"ContainerDied","Data":"755ea01224fb066d04c976f8cad376872c618107233b8561fec96e86bd60c421"} Oct 07 15:00:20 crc kubenswrapper[4717]: I1007 15:00:20.226911 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6hxfl" event={"ID":"be55c7be-af1c-44bc-ac2c-ac1db8fb4a82","Type":"ContainerStarted","Data":"f91200379d4226a8a7e43bee93f54e344ec7f6aa44352654f44c791b3edbc92a"} Oct 07 15:00:20 crc kubenswrapper[4717]: I1007 15:00:20.269812 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6hxfl" podStartSLOduration=2.762303931 podStartE2EDuration="10.269783345s" podCreationTimestamp="2025-10-07 15:00:10 +0000 UTC" firstStartedPulling="2025-10-07 15:00:12.134350993 +0000 UTC m=+3993.962276785" lastFinishedPulling="2025-10-07 15:00:19.641830407 +0000 UTC m=+4001.469756199" observedRunningTime="2025-10-07 15:00:20.253695035 +0000 UTC m=+4002.081620827" watchObservedRunningTime="2025-10-07 15:00:20.269783345 +0000 UTC m=+4002.097709157" Oct 07 15:00:20 crc kubenswrapper[4717]: I1007 15:00:20.706511 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:20 crc kubenswrapper[4717]: I1007 15:00:20.706902 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:21 crc kubenswrapper[4717]: I1007 15:00:21.763778 4717 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-6hxfl" podUID="be55c7be-af1c-44bc-ac2c-ac1db8fb4a82" containerName="registry-server" probeResult="failure" output=< Oct 07 15:00:21 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 15:00:21 crc kubenswrapper[4717]: > Oct 07 15:00:30 crc kubenswrapper[4717]: I1007 15:00:30.767660 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:30 crc kubenswrapper[4717]: I1007 15:00:30.822752 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6hxfl" Oct 07 15:00:30 crc kubenswrapper[4717]: I1007 15:00:30.937928 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6hxfl"] Oct 07 15:00:31 crc kubenswrapper[4717]: I1007 15:00:31.007962 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 15:00:31 crc kubenswrapper[4717]: I1007 15:00:31.008584 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8zvs9" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="registry-server" containerID="cri-o://9dbb21310dbe464a795cead66c96456169fab09665a9aa35322c48a4c783af15" gracePeriod=2 Oct 07 15:00:31 crc kubenswrapper[4717]: I1007 15:00:31.369796 4717 generic.go:334] "Generic (PLEG): container finished" podID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerID="9dbb21310dbe464a795cead66c96456169fab09665a9aa35322c48a4c783af15" exitCode=0 Oct 07 15:00:31 crc kubenswrapper[4717]: I1007 15:00:31.369955 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerDied","Data":"9dbb21310dbe464a795cead66c96456169fab09665a9aa35322c48a4c783af15"} Oct 07 15:00:31 crc kubenswrapper[4717]: I1007 15:00:31.862512 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.018890 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities\") pod \"b1fa580a-f78c-49a8-bd15-8d173925591a\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.019528 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb9rs\" (UniqueName: \"kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs\") pod \"b1fa580a-f78c-49a8-bd15-8d173925591a\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.019910 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content\") pod \"b1fa580a-f78c-49a8-bd15-8d173925591a\" (UID: \"b1fa580a-f78c-49a8-bd15-8d173925591a\") " Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.021502 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities" (OuterVolumeSpecName: "utilities") pod "b1fa580a-f78c-49a8-bd15-8d173925591a" (UID: "b1fa580a-f78c-49a8-bd15-8d173925591a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.036296 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs" (OuterVolumeSpecName: "kube-api-access-xb9rs") pod "b1fa580a-f78c-49a8-bd15-8d173925591a" (UID: "b1fa580a-f78c-49a8-bd15-8d173925591a"). InnerVolumeSpecName "kube-api-access-xb9rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.122769 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.122804 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb9rs\" (UniqueName: \"kubernetes.io/projected/b1fa580a-f78c-49a8-bd15-8d173925591a-kube-api-access-xb9rs\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.371352 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1fa580a-f78c-49a8-bd15-8d173925591a" (UID: "b1fa580a-f78c-49a8-bd15-8d173925591a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.381093 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8zvs9" event={"ID":"b1fa580a-f78c-49a8-bd15-8d173925591a","Type":"ContainerDied","Data":"d69ebe0250989b5ebcfd5cece5bed9279eedf89ed5fd3d9477bd0ca456b07d55"} Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.381182 4717 scope.go:117] "RemoveContainer" containerID="9dbb21310dbe464a795cead66c96456169fab09665a9aa35322c48a4c783af15" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.381115 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8zvs9" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.428134 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1fa580a-f78c-49a8-bd15-8d173925591a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.440573 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.454676 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8zvs9"] Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.510495 4717 scope.go:117] "RemoveContainer" containerID="c8a229e9b02ee80ad85d00a64eee03a1ba60aa17c2b68d8ffb39becb84e7460c" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.578129 4717 scope.go:117] "RemoveContainer" containerID="03c92bbd648cbdd8c13fa3ee08821e6de09238f9ff7212825d5b29dc8053072f" Oct 07 15:00:32 crc kubenswrapper[4717]: I1007 15:00:32.880354 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" path="/var/lib/kubelet/pods/b1fa580a-f78c-49a8-bd15-8d173925591a/volumes" Oct 07 15:00:45 crc kubenswrapper[4717]: I1007 15:00:45.569706 4717 scope.go:117] "RemoveContainer" containerID="f679a4ed24c1cb504b1e198617971169051b6ad035d9747c69dfdfc6ff3426f6" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.148822 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29330821-ncmw5"] Oct 07 15:01:00 crc kubenswrapper[4717]: E1007 15:01:00.151868 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="extract-utilities" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.151991 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="extract-utilities" Oct 07 15:01:00 crc kubenswrapper[4717]: E1007 15:01:00.152138 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="registry-server" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.152220 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="registry-server" Oct 07 15:01:00 crc kubenswrapper[4717]: E1007 15:01:00.152326 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="extract-content" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.152395 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="extract-content" Oct 07 15:01:00 crc kubenswrapper[4717]: 
I1007 15:01:00.152656 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1fa580a-f78c-49a8-bd15-8d173925591a" containerName="registry-server" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.153551 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.162834 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-ncmw5"] Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.191359 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.191731 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.191883 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmzkn\" (UniqueName: \"kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.192115 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.295271 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.295382 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmzkn\" (UniqueName: \"kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.295427 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.295557 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.304332 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.305314 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.312333 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.313539 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmzkn\" (UniqueName: \"kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn\") pod \"keystone-cron-29330821-ncmw5\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.478065 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:00 crc kubenswrapper[4717]: I1007 15:01:00.976399 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29330821-ncmw5"] Oct 07 15:01:01 crc kubenswrapper[4717]: I1007 15:01:01.704452 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-ncmw5" event={"ID":"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d","Type":"ContainerStarted","Data":"ff199460998ee509d3320fdea26df2ae6ad85ea7260cd4e3f8633873fbce54e3"} Oct 07 15:01:01 crc kubenswrapper[4717]: I1007 15:01:01.706579 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-ncmw5" event={"ID":"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d","Type":"ContainerStarted","Data":"901279fec096c986847a9a797c13b4e6ffbb7bd9ecc5b8505433a548cdbc16ee"} Oct 07 15:01:01 crc kubenswrapper[4717]: I1007 15:01:01.721668 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29330821-ncmw5" podStartSLOduration=1.721650941 podStartE2EDuration="1.721650941s" podCreationTimestamp="2025-10-07 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:01:01.721624321 +0000 UTC m=+4043.549550123" watchObservedRunningTime="2025-10-07 15:01:01.721650941 +0000 UTC m=+4043.549576733" Oct 07 15:01:05 crc kubenswrapper[4717]: I1007 15:01:05.740396 4717 generic.go:334] "Generic (PLEG): container finished" podID="6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" containerID="ff199460998ee509d3320fdea26df2ae6ad85ea7260cd4e3f8633873fbce54e3" exitCode=0 Oct 07 15:01:05 crc kubenswrapper[4717]: I1007 15:01:05.740476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-ncmw5" event={"ID":"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d","Type":"ContainerDied","Data":"ff199460998ee509d3320fdea26df2ae6ad85ea7260cd4e3f8633873fbce54e3"} Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.405854 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.558922 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle\") pod \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.559346 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmzkn\" (UniqueName: \"kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn\") pod \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.559449 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys\") pod \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.559603 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data\") pod \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\" (UID: \"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d\") " Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.566265 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn" (OuterVolumeSpecName: "kube-api-access-jmzkn") pod "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" (UID: "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d"). InnerVolumeSpecName "kube-api-access-jmzkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.572858 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" (UID: "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.610182 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" (UID: "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.639882 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data" (OuterVolumeSpecName: "config-data") pod "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" (UID: "6cd311ec-9f03-47a4-8e27-5e4553bf4c1d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.661431 4717 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.661490 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmzkn\" (UniqueName: \"kubernetes.io/projected/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-kube-api-access-jmzkn\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.661503 4717 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.661511 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cd311ec-9f03-47a4-8e27-5e4553bf4c1d-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.760129 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29330821-ncmw5" event={"ID":"6cd311ec-9f03-47a4-8e27-5e4553bf4c1d","Type":"ContainerDied","Data":"901279fec096c986847a9a797c13b4e6ffbb7bd9ecc5b8505433a548cdbc16ee"} Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.760168 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901279fec096c986847a9a797c13b4e6ffbb7bd9ecc5b8505433a548cdbc16ee" Oct 07 15:01:07 crc kubenswrapper[4717]: I1007 15:01:07.760218 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29330821-ncmw5" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.689731 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:22 crc kubenswrapper[4717]: E1007 15:01:22.691290 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" containerName="keystone-cron" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.691310 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" containerName="keystone-cron" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.691610 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd311ec-9f03-47a4-8e27-5e4553bf4c1d" containerName="keystone-cron" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.693430 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.707264 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.779459 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np5j6\" (UniqueName: \"kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.779549 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.779694 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.881281 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np5j6\" (UniqueName: \"kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.881332 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.881427 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.881955 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.882517 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:22 crc kubenswrapper[4717]: I1007 15:01:22.906165 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-np5j6\" (UniqueName: \"kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6\") pod \"community-operators-mxrbf\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:23 crc kubenswrapper[4717]: I1007 15:01:23.016322 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:23 crc kubenswrapper[4717]: I1007 15:01:23.659134 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:23 crc kubenswrapper[4717]: W1007 15:01:23.668320 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd69e3e1_2d43_4ffe_93a6_2964b70422f7.slice/crio-caf83f8e30193c8d0c9c0a1a747312efa3974e539df3c87b7cfbf1e06c1d7086 WatchSource:0}: Error finding container caf83f8e30193c8d0c9c0a1a747312efa3974e539df3c87b7cfbf1e06c1d7086: Status 404 returned error can't find the container with id caf83f8e30193c8d0c9c0a1a747312efa3974e539df3c87b7cfbf1e06c1d7086 Oct 07 15:01:23 crc kubenswrapper[4717]: I1007 15:01:23.925250 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerStarted","Data":"caf83f8e30193c8d0c9c0a1a747312efa3974e539df3c87b7cfbf1e06c1d7086"} Oct 07 15:01:24 crc kubenswrapper[4717]: I1007 15:01:24.936887 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerID="997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525" exitCode=0 Oct 07 15:01:24 crc kubenswrapper[4717]: I1007 15:01:24.937035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerDied","Data":"997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525"} Oct 07 15:01:25 crc kubenswrapper[4717]: I1007 15:01:25.957661 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerStarted","Data":"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a"} Oct 07 15:01:27 crc kubenswrapper[4717]: I1007 15:01:27.977997 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerID="c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a" exitCode=0 Oct 07 15:01:27 crc kubenswrapper[4717]: I1007 15:01:27.978079 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerDied","Data":"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a"} Oct 07 15:01:28 crc kubenswrapper[4717]: I1007 15:01:28.988429 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerStarted","Data":"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17"} Oct 07 15:01:29 crc kubenswrapper[4717]: I1007 15:01:29.009784 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mxrbf" 
podStartSLOduration=3.217674678 podStartE2EDuration="7.009760013s" podCreationTimestamp="2025-10-07 15:01:22 +0000 UTC" firstStartedPulling="2025-10-07 15:01:24.939377249 +0000 UTC m=+4066.767303051" lastFinishedPulling="2025-10-07 15:01:28.731462604 +0000 UTC m=+4070.559388386" observedRunningTime="2025-10-07 15:01:29.008353615 +0000 UTC m=+4070.836279427" watchObservedRunningTime="2025-10-07 15:01:29.009760013 +0000 UTC m=+4070.837685815" Oct 07 15:01:31 crc kubenswrapper[4717]: I1007 15:01:31.609921 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:01:31 crc kubenswrapper[4717]: I1007 15:01:31.610629 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:01:33 crc kubenswrapper[4717]: I1007 15:01:33.016520 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:33 crc kubenswrapper[4717]: I1007 15:01:33.018004 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:33 crc kubenswrapper[4717]: I1007 15:01:33.071672 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:34 crc kubenswrapper[4717]: I1007 15:01:34.083792 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:34 crc kubenswrapper[4717]: I1007 15:01:34.140533 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.048272 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mxrbf" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="registry-server" containerID="cri-o://667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17" gracePeriod=2 Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.830511 4717 util.go:48] "No ready sandbox for pod can be found. 
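[Editor's note] On the pod_startup_latency_tracker entry above for openshift-marketplace/community-operators-mxrbf: podStartSLOduration appears to be podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling), computed on the monotonic "m=+" clock. A minimal sketch of that arithmetic, using only the values quoted in the log entry (illustrative only, not kubelet code):

```go
package main

import "fmt"

func main() {
	// Monotonic offsets ("m=+...") and the E2E duration quoted in the log above.
	firstStartedPulling := 4066.767303051 // seconds since kubelet start
	lastFinishedPulling := 4070.559388386
	podStartE2E := 7.009760013 // podStartE2EDuration, seconds

	pullWindow := lastFinishedPulling - firstStartedPulling // ~3.792085335s
	slo := podStartE2E - pullWindow                         // ~3.217674678s

	fmt.Printf("image-pull window: %.9fs\n", pullWindow)
	fmt.Printf("SLO startup (E2E minus pull): %.9fs\n", slo)
	// Agrees (up to float rounding) with the logged podStartSLOduration=3.217674678.
}
```

The same relationship holds for the later startup-latency entries in this log (redhat-operators-xfn5b and redhat-marketplace-n487b).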
Need to start a new one" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.898288 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content\") pod \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.898848 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np5j6\" (UniqueName: \"kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6\") pod \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.898914 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities\") pod \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\" (UID: \"dd69e3e1-2d43-4ffe-93a6-2964b70422f7\") " Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.905925 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities" (OuterVolumeSpecName: "utilities") pod "dd69e3e1-2d43-4ffe-93a6-2964b70422f7" (UID: "dd69e3e1-2d43-4ffe-93a6-2964b70422f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.916556 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6" (OuterVolumeSpecName: "kube-api-access-np5j6") pod "dd69e3e1-2d43-4ffe-93a6-2964b70422f7" (UID: "dd69e3e1-2d43-4ffe-93a6-2964b70422f7"). InnerVolumeSpecName "kube-api-access-np5j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:01:36 crc kubenswrapper[4717]: I1007 15:01:36.964385 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd69e3e1-2d43-4ffe-93a6-2964b70422f7" (UID: "dd69e3e1-2d43-4ffe-93a6-2964b70422f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.001393 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np5j6\" (UniqueName: \"kubernetes.io/projected/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-kube-api-access-np5j6\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.001430 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.001440 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd69e3e1-2d43-4ffe-93a6-2964b70422f7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.064501 4717 generic.go:334] "Generic (PLEG): container finished" podID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerID="667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17" exitCode=0 Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.064663 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerDied","Data":"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17"} Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.064718 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mxrbf" event={"ID":"dd69e3e1-2d43-4ffe-93a6-2964b70422f7","Type":"ContainerDied","Data":"caf83f8e30193c8d0c9c0a1a747312efa3974e539df3c87b7cfbf1e06c1d7086"} Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.064750 4717 scope.go:117] "RemoveContainer" containerID="667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.064660 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mxrbf" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.092905 4717 scope.go:117] "RemoveContainer" containerID="c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.116487 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.130875 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mxrbf"] Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.134864 4717 scope.go:117] "RemoveContainer" containerID="997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.176443 4717 scope.go:117] "RemoveContainer" containerID="667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17" Oct 07 15:01:37 crc kubenswrapper[4717]: E1007 15:01:37.177647 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17\": container with ID starting with 667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17 not found: ID does not exist" containerID="667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.177690 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17"} err="failed to get container status \"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17\": rpc error: code = NotFound desc = could not find container \"667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17\": container with ID starting with 667ff378a0d9220e1c63fa05fad66cfdbfafcef3c1e052d1dba93354dc3f5b17 not found: ID does not exist" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.177718 4717 scope.go:117] "RemoveContainer" containerID="c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a" Oct 07 15:01:37 crc kubenswrapper[4717]: E1007 15:01:37.178025 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a\": container with ID starting with c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a not found: ID does not exist" containerID="c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.178056 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a"} err="failed to get container status \"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a\": rpc error: code = NotFound desc = could not find container \"c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a\": container with ID starting with c32397332044c9c03c153de0ea6233f77154749a95989867753fa6890954f63a not found: ID does not exist" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.178077 4717 scope.go:117] "RemoveContainer" containerID="997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525" Oct 07 15:01:37 crc kubenswrapper[4717]: E1007 15:01:37.178613 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525\": container with ID starting with 997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525 not found: ID does not exist" containerID="997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525" Oct 07 15:01:37 crc kubenswrapper[4717]: I1007 15:01:37.178640 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525"} err="failed to get container status \"997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525\": rpc error: code = NotFound desc = could not find container \"997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525\": container with ID starting with 997f58ca0c9965fe5e5108d1a80043a63e98c334f02d6cb6fd533b3688c23525 not found: ID does not exist" Oct 07 15:01:38 crc kubenswrapper[4717]: I1007 15:01:38.882357 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" path="/var/lib/kubelet/pods/dd69e3e1-2d43-4ffe-93a6-2964b70422f7/volumes" Oct 07 15:02:01 crc kubenswrapper[4717]: I1007 15:02:01.610441 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:02:01 crc kubenswrapper[4717]: I1007 15:02:01.611148 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:02:31 crc kubenswrapper[4717]: I1007 15:02:31.609894 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:02:31 crc kubenswrapper[4717]: I1007 15:02:31.610613 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:02:31 crc kubenswrapper[4717]: I1007 15:02:31.610687 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 15:02:31 crc kubenswrapper[4717]: I1007 15:02:31.611459 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:02:31 crc kubenswrapper[4717]: I1007 15:02:31.611512 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" 
podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047" gracePeriod=600 Oct 07 15:02:32 crc kubenswrapper[4717]: I1007 15:02:32.585451 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047" exitCode=0 Oct 07 15:02:32 crc kubenswrapper[4717]: I1007 15:02:32.585544 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047"} Oct 07 15:02:32 crc kubenswrapper[4717]: I1007 15:02:32.586181 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25"} Oct 07 15:02:32 crc kubenswrapper[4717]: I1007 15:02:32.586203 4717 scope.go:117] "RemoveContainer" containerID="01fe0f9fcd088face88ae81039733e8ef52529ed8c6ae7e443638f5048239aaf" Oct 07 15:04:31 crc kubenswrapper[4717]: I1007 15:04:31.610228 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:04:31 crc kubenswrapper[4717]: I1007 15:04:31.610770 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:05:01 crc kubenswrapper[4717]: I1007 15:05:01.610099 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:05:01 crc kubenswrapper[4717]: I1007 15:05:01.610671 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:05:31 crc kubenswrapper[4717]: I1007 15:05:31.609645 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:05:31 crc kubenswrapper[4717]: I1007 15:05:31.610203 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:05:31 
crc kubenswrapper[4717]: I1007 15:05:31.610247 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 15:05:31 crc kubenswrapper[4717]: I1007 15:05:31.611111 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:05:31 crc kubenswrapper[4717]: I1007 15:05:31.611166 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" gracePeriod=600 Oct 07 15:05:31 crc kubenswrapper[4717]: E1007 15:05:31.746512 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:05:32 crc kubenswrapper[4717]: I1007 15:05:32.199471 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" exitCode=0 Oct 07 15:05:32 crc kubenswrapper[4717]: I1007 15:05:32.199557 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25"} Oct 07 15:05:32 crc kubenswrapper[4717]: I1007 15:05:32.199795 4717 scope.go:117] "RemoveContainer" containerID="fa0cd6b77eff7aac17095efd33f5cbfd91708b673809efeef14d1a1138a99047" Oct 07 15:05:32 crc kubenswrapper[4717]: I1007 15:05:32.200498 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:05:32 crc kubenswrapper[4717]: E1007 15:05:32.200862 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:05:45 crc kubenswrapper[4717]: I1007 15:05:45.870444 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:05:45 crc kubenswrapper[4717]: E1007 15:05:45.871334 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:06:00 crc kubenswrapper[4717]: I1007 15:06:00.869074 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:06:00 crc kubenswrapper[4717]: E1007 15:06:00.869781 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:06:15 crc kubenswrapper[4717]: I1007 15:06:15.868239 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:06:15 crc kubenswrapper[4717]: E1007 15:06:15.869984 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:06:29 crc kubenswrapper[4717]: I1007 15:06:29.868992 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:06:29 crc kubenswrapper[4717]: E1007 15:06:29.869926 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:06:42 crc kubenswrapper[4717]: I1007 15:06:42.870999 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:06:42 crc kubenswrapper[4717]: E1007 15:06:42.872289 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:06:54 crc kubenswrapper[4717]: I1007 15:06:54.868088 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:06:54 crc kubenswrapper[4717]: E1007 15:06:54.868933 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:07:08 crc kubenswrapper[4717]: I1007 15:07:08.875445 4717 
scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:07:08 crc kubenswrapper[4717]: E1007 15:07:08.876398 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:07:21 crc kubenswrapper[4717]: I1007 15:07:21.870651 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:07:21 crc kubenswrapper[4717]: E1007 15:07:21.872649 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:07:33 crc kubenswrapper[4717]: I1007 15:07:33.869963 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:07:33 crc kubenswrapper[4717]: E1007 15:07:33.871341 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:07:47 crc kubenswrapper[4717]: I1007 15:07:47.868844 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:07:47 crc kubenswrapper[4717]: E1007 15:07:47.869796 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:07:59 crc kubenswrapper[4717]: I1007 15:07:59.868613 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:07:59 crc kubenswrapper[4717]: E1007 15:07:59.869412 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:08:13 crc kubenswrapper[4717]: I1007 15:08:13.868878 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:08:13 crc kubenswrapper[4717]: E1007 15:08:13.870323 4717 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:08:28 crc kubenswrapper[4717]: I1007 15:08:28.875398 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:08:28 crc kubenswrapper[4717]: E1007 15:08:28.876152 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:08:42 crc kubenswrapper[4717]: I1007 15:08:42.869204 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:08:42 crc kubenswrapper[4717]: E1007 15:08:42.870200 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:08:53 crc kubenswrapper[4717]: I1007 15:08:53.869139 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:08:53 crc kubenswrapper[4717]: E1007 15:08:53.870261 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:09:08 crc kubenswrapper[4717]: I1007 15:09:08.875952 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:09:08 crc kubenswrapper[4717]: E1007 15:09:08.876798 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:09:20 crc kubenswrapper[4717]: I1007 15:09:20.869593 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:09:20 crc kubenswrapper[4717]: E1007 15:09:20.870350 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:09:35 crc kubenswrapper[4717]: I1007 15:09:35.868393 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:09:35 crc kubenswrapper[4717]: E1007 15:09:35.869124 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.049880 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:09:44 crc kubenswrapper[4717]: E1007 15:09:44.050930 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="extract-utilities" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.050951 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="extract-utilities" Oct 07 15:09:44 crc kubenswrapper[4717]: E1007 15:09:44.050973 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="extract-content" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.050984 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="extract-content" Oct 07 15:09:44 crc kubenswrapper[4717]: E1007 15:09:44.051043 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="registry-server" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.051053 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="registry-server" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.051269 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd69e3e1-2d43-4ffe-93a6-2964b70422f7" containerName="registry-server" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.055279 4717 util.go:30] "No sandbox for pod can be found. 
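[Editor's note] On the recurring "back-off 5m0s restarting failed container=machine-config-daemon ..." errors above: once a container keeps being restarted (here via repeated liveness-probe kills), the kubelet delays each new start with an exponential back-off, commonly documented as starting at 10s, doubling per restart, and capping at 5 minutes; that cap is why every sync between roughly 15:05 and 15:10 is rejected with the same 5m0s message until the back-off window expires. A small sketch of that schedule (the constants are the commonly documented kubelet defaults, assumed here rather than read from this cluster's configuration):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet CrashLoopBackOff defaults: 10s initial delay,
	// doubled after each failed restart, capped at 5 minutes.
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second

	for restart := 1; restart <= 8; restart++ {
		fmt.Printf("restart #%d: wait %v before the next attempt\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay // once capped, every retry logs "back-off 5m0s ..."
		}
	}
}
```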
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.067759 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.134769 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppwbl\" (UniqueName: \"kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.135097 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.135186 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.236653 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.236747 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.236767 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppwbl\" (UniqueName: \"kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.237673 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.237956 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.258752 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ppwbl\" (UniqueName: \"kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl\") pod \"redhat-operators-xfn5b\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:44 crc kubenswrapper[4717]: I1007 15:09:44.391138 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:09:45 crc kubenswrapper[4717]: I1007 15:09:45.022070 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:09:45 crc kubenswrapper[4717]: I1007 15:09:45.499596 4717 generic.go:334] "Generic (PLEG): container finished" podID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerID="9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8" exitCode=0 Oct 07 15:09:45 crc kubenswrapper[4717]: I1007 15:09:45.499658 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerDied","Data":"9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8"} Oct 07 15:09:45 crc kubenswrapper[4717]: I1007 15:09:45.500172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerStarted","Data":"7d59ec47585a4b14d7991acddabdee6cf2e930fb9c8d9166d89909a98c57d809"} Oct 07 15:09:45 crc kubenswrapper[4717]: I1007 15:09:45.502439 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:09:47 crc kubenswrapper[4717]: I1007 15:09:47.528511 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerStarted","Data":"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3"} Oct 07 15:09:47 crc kubenswrapper[4717]: I1007 15:09:47.869287 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:09:47 crc kubenswrapper[4717]: E1007 15:09:47.869574 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.130789 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.135576 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.149393 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.216041 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrc9q\" (UniqueName: \"kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.216545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.216657 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.318358 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.318411 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.318469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrc9q\" (UniqueName: \"kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.319037 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.319214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.344285 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xrc9q\" (UniqueName: \"kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q\") pod \"redhat-marketplace-n487b\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:52 crc kubenswrapper[4717]: I1007 15:09:52.460498 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:09:53 crc kubenswrapper[4717]: I1007 15:09:53.002870 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:09:53 crc kubenswrapper[4717]: I1007 15:09:53.591944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerStarted","Data":"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98"} Oct 07 15:09:53 crc kubenswrapper[4717]: I1007 15:09:53.592249 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerStarted","Data":"5667ae891ecb47e0e61f219f94b67846493a4e817b162b3574b50855045483f2"} Oct 07 15:09:54 crc kubenswrapper[4717]: I1007 15:09:54.604945 4717 generic.go:334] "Generic (PLEG): container finished" podID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerID="321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98" exitCode=0 Oct 07 15:09:54 crc kubenswrapper[4717]: I1007 15:09:54.605039 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerDied","Data":"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98"} Oct 07 15:09:55 crc kubenswrapper[4717]: I1007 15:09:55.617082 4717 generic.go:334] "Generic (PLEG): container finished" podID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerID="a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3" exitCode=0 Oct 07 15:09:55 crc kubenswrapper[4717]: I1007 15:09:55.617377 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerDied","Data":"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3"} Oct 07 15:09:56 crc kubenswrapper[4717]: I1007 15:09:56.628993 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerStarted","Data":"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0"} Oct 07 15:09:56 crc kubenswrapper[4717]: I1007 15:09:56.633132 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerStarted","Data":"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e"} Oct 07 15:09:56 crc kubenswrapper[4717]: I1007 15:09:56.668380 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xfn5b" podStartSLOduration=1.903322848 podStartE2EDuration="12.668359643s" podCreationTimestamp="2025-10-07 15:09:44 +0000 UTC" firstStartedPulling="2025-10-07 15:09:45.502167078 +0000 UTC m=+4567.330092870" lastFinishedPulling="2025-10-07 15:09:56.267203873 +0000 UTC 
m=+4578.095129665" observedRunningTime="2025-10-07 15:09:56.661538506 +0000 UTC m=+4578.489464308" watchObservedRunningTime="2025-10-07 15:09:56.668359643 +0000 UTC m=+4578.496285435" Oct 07 15:09:57 crc kubenswrapper[4717]: I1007 15:09:57.643937 4717 generic.go:334] "Generic (PLEG): container finished" podID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerID="f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0" exitCode=0 Oct 07 15:09:57 crc kubenswrapper[4717]: I1007 15:09:57.644029 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerDied","Data":"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0"} Oct 07 15:09:58 crc kubenswrapper[4717]: I1007 15:09:58.656271 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerStarted","Data":"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707"} Oct 07 15:09:58 crc kubenswrapper[4717]: I1007 15:09:58.677950 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n487b" podStartSLOduration=3.038272777 podStartE2EDuration="6.677930863s" podCreationTimestamp="2025-10-07 15:09:52 +0000 UTC" firstStartedPulling="2025-10-07 15:09:54.614949742 +0000 UTC m=+4576.442875534" lastFinishedPulling="2025-10-07 15:09:58.254607828 +0000 UTC m=+4580.082533620" observedRunningTime="2025-10-07 15:09:58.67415298 +0000 UTC m=+4580.502078772" watchObservedRunningTime="2025-10-07 15:09:58.677930863 +0000 UTC m=+4580.505856655" Oct 07 15:10:02 crc kubenswrapper[4717]: I1007 15:10:02.460984 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:02 crc kubenswrapper[4717]: I1007 15:10:02.461682 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:02 crc kubenswrapper[4717]: I1007 15:10:02.868531 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:10:02 crc kubenswrapper[4717]: E1007 15:10:02.868819 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:10:02 crc kubenswrapper[4717]: I1007 15:10:02.980209 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:03 crc kubenswrapper[4717]: I1007 15:10:03.753970 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:03 crc kubenswrapper[4717]: I1007 15:10:03.815138 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:10:04 crc kubenswrapper[4717]: I1007 15:10:04.392422 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:04 crc kubenswrapper[4717]: I1007 
15:10:04.392775 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:05 crc kubenswrapper[4717]: I1007 15:10:05.726480 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n487b" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="registry-server" containerID="cri-o://9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707" gracePeriod=2 Oct 07 15:10:05 crc kubenswrapper[4717]: I1007 15:10:05.886085 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xfn5b" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" probeResult="failure" output=< Oct 07 15:10:05 crc kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 15:10:05 crc kubenswrapper[4717]: > Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.296444 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.416317 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities\") pod \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.416621 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content\") pod \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.416723 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrc9q\" (UniqueName: \"kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q\") pod \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\" (UID: \"159ac7d8-7aa7-4c22-8781-e87bf7b35651\") " Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.417309 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities" (OuterVolumeSpecName: "utilities") pod "159ac7d8-7aa7-4c22-8781-e87bf7b35651" (UID: "159ac7d8-7aa7-4c22-8781-e87bf7b35651"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.423685 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q" (OuterVolumeSpecName: "kube-api-access-xrc9q") pod "159ac7d8-7aa7-4c22-8781-e87bf7b35651" (UID: "159ac7d8-7aa7-4c22-8781-e87bf7b35651"). InnerVolumeSpecName "kube-api-access-xrc9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.430451 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "159ac7d8-7aa7-4c22-8781-e87bf7b35651" (UID: "159ac7d8-7aa7-4c22-8781-e87bf7b35651"). InnerVolumeSpecName "catalog-content". 
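[Editor's note] On the Startup probe failure above for redhat-operators-xfn5b ("timeout: failed to connect service \":50051\" within 1s"): that output is characteristic of a gRPC health-check probe against the catalog registry-server's port 50051 with a one-second budget, which keeps failing until the catalog content has been extracted and the server starts listening. A rough stand-in for such a check, reduced to a plain TCP dial with the address and timeout quoted in the log (the real probe is a gRPC health check, so this is only an approximation):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "127.0.0.1:50051"  // port quoted in the probe output
	timeout := 1 * time.Second // "within 1s" from the probe output

	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		// While the registry-server is still unpacking its catalog, this is
		// roughly the failure the kubelet keeps reporting for the startup probe.
		fmt.Printf("probe failed: %v\n", err)
		return
	}
	defer conn.Close()
	fmt.Println("probe succeeded: something is listening on :50051")
}
```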
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.519477 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.519713 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/159ac7d8-7aa7-4c22-8781-e87bf7b35651-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.519773 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrc9q\" (UniqueName: \"kubernetes.io/projected/159ac7d8-7aa7-4c22-8781-e87bf7b35651-kube-api-access-xrc9q\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.736596 4717 generic.go:334] "Generic (PLEG): container finished" podID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerID="9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707" exitCode=0 Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.736655 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n487b" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.736675 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerDied","Data":"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707"} Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.737965 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n487b" event={"ID":"159ac7d8-7aa7-4c22-8781-e87bf7b35651","Type":"ContainerDied","Data":"5667ae891ecb47e0e61f219f94b67846493a4e817b162b3574b50855045483f2"} Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.737985 4717 scope.go:117] "RemoveContainer" containerID="9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.761259 4717 scope.go:117] "RemoveContainer" containerID="f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.789093 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.799205 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n487b"] Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.810407 4717 scope.go:117] "RemoveContainer" containerID="321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.865926 4717 scope.go:117] "RemoveContainer" containerID="9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707" Oct 07 15:10:06 crc kubenswrapper[4717]: E1007 15:10:06.866435 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707\": container with ID starting with 9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707 not found: ID does not exist" containerID="9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.866540 4717 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707"} err="failed to get container status \"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707\": rpc error: code = NotFound desc = could not find container \"9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707\": container with ID starting with 9ddca1afb1d40f7813a5ca22e0c7257b16e9803f4ba77447c7db359e92b7e707 not found: ID does not exist" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.866750 4717 scope.go:117] "RemoveContainer" containerID="f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0" Oct 07 15:10:06 crc kubenswrapper[4717]: E1007 15:10:06.867090 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0\": container with ID starting with f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0 not found: ID does not exist" containerID="f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.867190 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0"} err="failed to get container status \"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0\": rpc error: code = NotFound desc = could not find container \"f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0\": container with ID starting with f3029b14b37d7dcf67a91c2055741dbca318f925b7af069c00e29a47144262a0 not found: ID does not exist" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.867273 4717 scope.go:117] "RemoveContainer" containerID="321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98" Oct 07 15:10:06 crc kubenswrapper[4717]: E1007 15:10:06.867833 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98\": container with ID starting with 321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98 not found: ID does not exist" containerID="321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.871267 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98"} err="failed to get container status \"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98\": rpc error: code = NotFound desc = could not find container \"321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98\": container with ID starting with 321a80364d4d90ebfad9f103bfc601bf22e9b95b113468384600e463c64c9b98 not found: ID does not exist" Oct 07 15:10:06 crc kubenswrapper[4717]: I1007 15:10:06.883115 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" path="/var/lib/kubelet/pods/159ac7d8-7aa7-4c22-8781-e87bf7b35651/volumes" Oct 07 15:10:15 crc kubenswrapper[4717]: I1007 15:10:15.437178 4717 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xfn5b" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" probeResult="failure" output=< Oct 07 15:10:15 crc 
kubenswrapper[4717]: timeout: failed to connect service ":50051" within 1s Oct 07 15:10:15 crc kubenswrapper[4717]: > Oct 07 15:10:17 crc kubenswrapper[4717]: I1007 15:10:17.871454 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:10:17 crc kubenswrapper[4717]: E1007 15:10:17.872239 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:10:24 crc kubenswrapper[4717]: I1007 15:10:24.441471 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:24 crc kubenswrapper[4717]: I1007 15:10:24.497016 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:24 crc kubenswrapper[4717]: I1007 15:10:24.685472 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:10:25 crc kubenswrapper[4717]: I1007 15:10:25.911541 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xfn5b" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" containerID="cri-o://b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e" gracePeriod=2 Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.523718 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.672049 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content\") pod \"6c23c89b-1e47-491e-b6af-3d143b578acb\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.672181 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppwbl\" (UniqueName: \"kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl\") pod \"6c23c89b-1e47-491e-b6af-3d143b578acb\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.672219 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities\") pod \"6c23c89b-1e47-491e-b6af-3d143b578acb\" (UID: \"6c23c89b-1e47-491e-b6af-3d143b578acb\") " Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.673187 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities" (OuterVolumeSpecName: "utilities") pod "6c23c89b-1e47-491e-b6af-3d143b578acb" (UID: "6c23c89b-1e47-491e-b6af-3d143b578acb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.684317 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl" (OuterVolumeSpecName: "kube-api-access-ppwbl") pod "6c23c89b-1e47-491e-b6af-3d143b578acb" (UID: "6c23c89b-1e47-491e-b6af-3d143b578acb"). InnerVolumeSpecName "kube-api-access-ppwbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.762098 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c23c89b-1e47-491e-b6af-3d143b578acb" (UID: "6c23c89b-1e47-491e-b6af-3d143b578acb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.774997 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.775054 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppwbl\" (UniqueName: \"kubernetes.io/projected/6c23c89b-1e47-491e-b6af-3d143b578acb-kube-api-access-ppwbl\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.775074 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c23c89b-1e47-491e-b6af-3d143b578acb-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.934607 4717 generic.go:334] "Generic (PLEG): container finished" podID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerID="b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e" exitCode=0 Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.934694 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerDied","Data":"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e"} Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.934730 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xfn5b" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.934770 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xfn5b" event={"ID":"6c23c89b-1e47-491e-b6af-3d143b578acb","Type":"ContainerDied","Data":"7d59ec47585a4b14d7991acddabdee6cf2e930fb9c8d9166d89909a98c57d809"} Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.934807 4717 scope.go:117] "RemoveContainer" containerID="b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.959318 4717 scope.go:117] "RemoveContainer" containerID="a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3" Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.973210 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.982132 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xfn5b"] Oct 07 15:10:26 crc kubenswrapper[4717]: I1007 15:10:26.997378 4717 scope.go:117] "RemoveContainer" containerID="9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.045918 4717 scope.go:117] "RemoveContainer" containerID="b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e" Oct 07 15:10:27 crc kubenswrapper[4717]: E1007 15:10:27.048529 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e\": container with ID starting with b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e not found: ID does not exist" containerID="b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.048603 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e"} err="failed to get container status \"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e\": rpc error: code = NotFound desc = could not find container \"b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e\": container with ID starting with b15ab9f3a314ebd2c5507a26016cebda702df2f18509ab9650b1ec4c32d9340e not found: ID does not exist" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.048644 4717 scope.go:117] "RemoveContainer" containerID="a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3" Oct 07 15:10:27 crc kubenswrapper[4717]: E1007 15:10:27.048997 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3\": container with ID starting with a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3 not found: ID does not exist" containerID="a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.049051 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3"} err="failed to get container status \"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3\": rpc error: code = NotFound desc = could not find container 
\"a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3\": container with ID starting with a5298daab0753ba6ed3ffb3bb0334216d7213170164ea9860032f0dbcd0a90a3 not found: ID does not exist" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.049074 4717 scope.go:117] "RemoveContainer" containerID="9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8" Oct 07 15:10:27 crc kubenswrapper[4717]: E1007 15:10:27.049465 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8\": container with ID starting with 9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8 not found: ID does not exist" containerID="9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8" Oct 07 15:10:27 crc kubenswrapper[4717]: I1007 15:10:27.049528 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8"} err="failed to get container status \"9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8\": rpc error: code = NotFound desc = could not find container \"9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8\": container with ID starting with 9b54f447c91a0985045592a33164e8856ec5593058aff01b01295321402468b8 not found: ID does not exist" Oct 07 15:10:28 crc kubenswrapper[4717]: I1007 15:10:28.881200 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" path="/var/lib/kubelet/pods/6c23c89b-1e47-491e-b6af-3d143b578acb/volumes" Oct 07 15:10:30 crc kubenswrapper[4717]: I1007 15:10:30.872687 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:10:30 crc kubenswrapper[4717]: E1007 15:10:30.873163 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.613439 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614618 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="extract-utilities" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.614635 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="extract-utilities" Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614663 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="extract-content" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.614670 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="extract-content" Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614689 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" Oct 07 15:10:41 crc 
kubenswrapper[4717]: I1007 15:10:41.614698 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614713 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="registry-server" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.614722 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="registry-server" Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614736 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="extract-content" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.614743 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="extract-content" Oct 07 15:10:41 crc kubenswrapper[4717]: E1007 15:10:41.614773 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="extract-utilities" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.614781 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="extract-utilities" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.615035 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="159ac7d8-7aa7-4c22-8781-e87bf7b35651" containerName="registry-server" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.615067 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c23c89b-1e47-491e-b6af-3d143b578acb" containerName="registry-server" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.616878 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.632507 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.688639 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.689167 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.689407 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6p8v\" (UniqueName: \"kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.791650 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.791744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6p8v\" (UniqueName: \"kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.791834 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.792656 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.792857 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.828040 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-j6p8v\" (UniqueName: \"kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v\") pod \"certified-operators-xz595\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:41 crc kubenswrapper[4717]: I1007 15:10:41.953258 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:42 crc kubenswrapper[4717]: I1007 15:10:42.502121 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:43 crc kubenswrapper[4717]: I1007 15:10:43.094139 4717 generic.go:334] "Generic (PLEG): container finished" podID="a758d900-3a8a-4024-8068-dad9a13797ff" containerID="ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae" exitCode=0 Oct 07 15:10:43 crc kubenswrapper[4717]: I1007 15:10:43.094213 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerDied","Data":"ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae"} Oct 07 15:10:43 crc kubenswrapper[4717]: I1007 15:10:43.094476 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerStarted","Data":"eb756a0fbc9c5ea0801257b6b40a9364fb776f39ecee1392b93fd936e80232c0"} Oct 07 15:10:44 crc kubenswrapper[4717]: I1007 15:10:44.868338 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:10:45 crc kubenswrapper[4717]: I1007 15:10:45.114706 4717 generic.go:334] "Generic (PLEG): container finished" podID="a758d900-3a8a-4024-8068-dad9a13797ff" containerID="f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15" exitCode=0 Oct 07 15:10:45 crc kubenswrapper[4717]: I1007 15:10:45.114819 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerDied","Data":"f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15"} Oct 07 15:10:46 crc kubenswrapper[4717]: I1007 15:10:46.129581 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b"} Oct 07 15:10:47 crc kubenswrapper[4717]: I1007 15:10:47.153536 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerStarted","Data":"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565"} Oct 07 15:10:47 crc kubenswrapper[4717]: I1007 15:10:47.180406 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xz595" podStartSLOduration=3.323063903 podStartE2EDuration="6.180382617s" podCreationTimestamp="2025-10-07 15:10:41 +0000 UTC" firstStartedPulling="2025-10-07 15:10:43.098395615 +0000 UTC m=+4624.926321407" lastFinishedPulling="2025-10-07 15:10:45.955714329 +0000 UTC m=+4627.783640121" observedRunningTime="2025-10-07 15:10:47.178067704 +0000 UTC m=+4629.005993516" 
watchObservedRunningTime="2025-10-07 15:10:47.180382617 +0000 UTC m=+4629.008308409" Oct 07 15:10:51 crc kubenswrapper[4717]: I1007 15:10:51.954195 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:51 crc kubenswrapper[4717]: I1007 15:10:51.954735 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:52 crc kubenswrapper[4717]: I1007 15:10:52.005030 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:52 crc kubenswrapper[4717]: I1007 15:10:52.251726 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:52 crc kubenswrapper[4717]: I1007 15:10:52.303520 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.212099 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xz595" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="registry-server" containerID="cri-o://58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565" gracePeriod=2 Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.910293 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.989373 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities\") pod \"a758d900-3a8a-4024-8068-dad9a13797ff\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.989523 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content\") pod \"a758d900-3a8a-4024-8068-dad9a13797ff\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.989602 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6p8v\" (UniqueName: \"kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v\") pod \"a758d900-3a8a-4024-8068-dad9a13797ff\" (UID: \"a758d900-3a8a-4024-8068-dad9a13797ff\") " Oct 07 15:10:54 crc kubenswrapper[4717]: I1007 15:10:54.990150 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities" (OuterVolumeSpecName: "utilities") pod "a758d900-3a8a-4024-8068-dad9a13797ff" (UID: "a758d900-3a8a-4024-8068-dad9a13797ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:54.996284 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v" (OuterVolumeSpecName: "kube-api-access-j6p8v") pod "a758d900-3a8a-4024-8068-dad9a13797ff" (UID: "a758d900-3a8a-4024-8068-dad9a13797ff"). InnerVolumeSpecName "kube-api-access-j6p8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.059391 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a758d900-3a8a-4024-8068-dad9a13797ff" (UID: "a758d900-3a8a-4024-8068-dad9a13797ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.091820 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.091860 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6p8v\" (UniqueName: \"kubernetes.io/projected/a758d900-3a8a-4024-8068-dad9a13797ff-kube-api-access-j6p8v\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.091874 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a758d900-3a8a-4024-8068-dad9a13797ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.222460 4717 generic.go:334] "Generic (PLEG): container finished" podID="a758d900-3a8a-4024-8068-dad9a13797ff" containerID="58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565" exitCode=0 Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.222495 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerDied","Data":"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565"} Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.222521 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xz595" event={"ID":"a758d900-3a8a-4024-8068-dad9a13797ff","Type":"ContainerDied","Data":"eb756a0fbc9c5ea0801257b6b40a9364fb776f39ecee1392b93fd936e80232c0"} Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.222539 4717 scope.go:117] "RemoveContainer" containerID="58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.222674 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xz595" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.246083 4717 scope.go:117] "RemoveContainer" containerID="f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.260957 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.272431 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xz595"] Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.279423 4717 scope.go:117] "RemoveContainer" containerID="ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.323657 4717 scope.go:117] "RemoveContainer" containerID="58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565" Oct 07 15:10:55 crc kubenswrapper[4717]: E1007 15:10:55.324137 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565\": container with ID starting with 58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565 not found: ID does not exist" containerID="58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.324184 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565"} err="failed to get container status \"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565\": rpc error: code = NotFound desc = could not find container \"58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565\": container with ID starting with 58d7b232ec5e48f7b3cc6c33f87788f4779fbcf0a9a198e9147419e2c5347565 not found: ID does not exist" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.324210 4717 scope.go:117] "RemoveContainer" containerID="f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15" Oct 07 15:10:55 crc kubenswrapper[4717]: E1007 15:10:55.324612 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15\": container with ID starting with f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15 not found: ID does not exist" containerID="f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.324651 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15"} err="failed to get container status \"f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15\": rpc error: code = NotFound desc = could not find container \"f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15\": container with ID starting with f38a869ca8a77da088b21caaf46017a310f2dd6e5b84d84ad2c780c2bed62b15 not found: ID does not exist" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.324681 4717 scope.go:117] "RemoveContainer" containerID="ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae" Oct 07 15:10:55 crc kubenswrapper[4717]: E1007 15:10:55.325059 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae\": container with ID starting with ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae not found: ID does not exist" containerID="ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae" Oct 07 15:10:55 crc kubenswrapper[4717]: I1007 15:10:55.325092 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae"} err="failed to get container status \"ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae\": rpc error: code = NotFound desc = could not find container \"ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae\": container with ID starting with ac304cb62fdbe87fc22133ec5ecda4173c004d93f1fe25d4b2026d38721b23ae not found: ID does not exist" Oct 07 15:10:56 crc kubenswrapper[4717]: I1007 15:10:56.879335 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" path="/var/lib/kubelet/pods/a758d900-3a8a-4024-8068-dad9a13797ff/volumes" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.262358 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:12:47 crc kubenswrapper[4717]: E1007 15:12:47.263919 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="extract-content" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.263939 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="extract-content" Oct 07 15:12:47 crc kubenswrapper[4717]: E1007 15:12:47.264202 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="extract-utilities" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.264215 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="extract-utilities" Oct 07 15:12:47 crc kubenswrapper[4717]: E1007 15:12:47.264278 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="registry-server" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.264315 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="registry-server" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.264702 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a758d900-3a8a-4024-8068-dad9a13797ff" containerName="registry-server" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.268923 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.284596 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.384171 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.384825 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjc5\" (UniqueName: \"kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.385040 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.487703 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.487773 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjc5\" (UniqueName: \"kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.487837 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.488406 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.488681 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.530385 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jrjc5\" (UniqueName: \"kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5\") pod \"community-operators-nrbr2\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:47 crc kubenswrapper[4717]: I1007 15:12:47.607704 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:48 crc kubenswrapper[4717]: I1007 15:12:48.095512 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:12:49 crc kubenswrapper[4717]: I1007 15:12:49.263825 4717 generic.go:334] "Generic (PLEG): container finished" podID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerID="aca05d3d5607d458e03a506a05662258803bdd7199046e3c3e68b4114ead68e3" exitCode=0 Oct 07 15:12:49 crc kubenswrapper[4717]: I1007 15:12:49.263919 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerDied","Data":"aca05d3d5607d458e03a506a05662258803bdd7199046e3c3e68b4114ead68e3"} Oct 07 15:12:49 crc kubenswrapper[4717]: I1007 15:12:49.264316 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerStarted","Data":"4acc82ed7282fa543b849828ff83ce4595a613107087a92cd921587d590a7bbe"} Oct 07 15:12:50 crc kubenswrapper[4717]: I1007 15:12:50.276456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerStarted","Data":"eef669cc8762a88e64a2efb25928804887cbef1124cd462e6c0c1a7764f99767"} Oct 07 15:12:52 crc kubenswrapper[4717]: I1007 15:12:52.313427 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerDied","Data":"eef669cc8762a88e64a2efb25928804887cbef1124cd462e6c0c1a7764f99767"} Oct 07 15:12:52 crc kubenswrapper[4717]: I1007 15:12:52.313369 4717 generic.go:334] "Generic (PLEG): container finished" podID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerID="eef669cc8762a88e64a2efb25928804887cbef1124cd462e6c0c1a7764f99767" exitCode=0 Oct 07 15:12:53 crc kubenswrapper[4717]: I1007 15:12:53.326200 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerStarted","Data":"7c6c82dbb6874166101f24afeb3623fa99e99137c97483ab63cf15c7b09aa2c6"} Oct 07 15:12:53 crc kubenswrapper[4717]: I1007 15:12:53.354436 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nrbr2" podStartSLOduration=2.878451222 podStartE2EDuration="6.354414728s" podCreationTimestamp="2025-10-07 15:12:47 +0000 UTC" firstStartedPulling="2025-10-07 15:12:49.270112748 +0000 UTC m=+4751.098038540" lastFinishedPulling="2025-10-07 15:12:52.746076254 +0000 UTC m=+4754.574002046" observedRunningTime="2025-10-07 15:12:53.342956414 +0000 UTC m=+4755.170882206" watchObservedRunningTime="2025-10-07 15:12:53.354414728 +0000 UTC m=+4755.182340520" Oct 07 15:12:57 crc kubenswrapper[4717]: I1007 15:12:57.608978 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:57 crc kubenswrapper[4717]: I1007 15:12:57.609953 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:57 crc kubenswrapper[4717]: I1007 15:12:57.657575 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:58 crc kubenswrapper[4717]: I1007 15:12:58.413289 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:12:58 crc kubenswrapper[4717]: I1007 15:12:58.476479 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:13:00 crc kubenswrapper[4717]: I1007 15:13:00.383163 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nrbr2" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="registry-server" containerID="cri-o://7c6c82dbb6874166101f24afeb3623fa99e99137c97483ab63cf15c7b09aa2c6" gracePeriod=2 Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.397064 4717 generic.go:334] "Generic (PLEG): container finished" podID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerID="7c6c82dbb6874166101f24afeb3623fa99e99137c97483ab63cf15c7b09aa2c6" exitCode=0 Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.398078 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerDied","Data":"7c6c82dbb6874166101f24afeb3623fa99e99137c97483ab63cf15c7b09aa2c6"} Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.610844 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.611888 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.656860 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.696475 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content\") pod \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.696577 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities\") pod \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.696668 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrjc5\" (UniqueName: \"kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5\") pod \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\" (UID: \"242c5b9d-7a01-4e8f-ba45-c16fef61268b\") " Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.706643 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities" (OuterVolumeSpecName: "utilities") pod "242c5b9d-7a01-4e8f-ba45-c16fef61268b" (UID: "242c5b9d-7a01-4e8f-ba45-c16fef61268b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.714359 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5" (OuterVolumeSpecName: "kube-api-access-jrjc5") pod "242c5b9d-7a01-4e8f-ba45-c16fef61268b" (UID: "242c5b9d-7a01-4e8f-ba45-c16fef61268b"). InnerVolumeSpecName "kube-api-access-jrjc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.790333 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "242c5b9d-7a01-4e8f-ba45-c16fef61268b" (UID: "242c5b9d-7a01-4e8f-ba45-c16fef61268b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.798894 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.798938 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/242c5b9d-7a01-4e8f-ba45-c16fef61268b-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:01 crc kubenswrapper[4717]: I1007 15:13:01.798953 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrjc5\" (UniqueName: \"kubernetes.io/projected/242c5b9d-7a01-4e8f-ba45-c16fef61268b-kube-api-access-jrjc5\") on node \"crc\" DevicePath \"\"" Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.409512 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nrbr2" event={"ID":"242c5b9d-7a01-4e8f-ba45-c16fef61268b","Type":"ContainerDied","Data":"4acc82ed7282fa543b849828ff83ce4595a613107087a92cd921587d590a7bbe"} Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.409606 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nrbr2" Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.409814 4717 scope.go:117] "RemoveContainer" containerID="7c6c82dbb6874166101f24afeb3623fa99e99137c97483ab63cf15c7b09aa2c6" Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.440464 4717 scope.go:117] "RemoveContainer" containerID="eef669cc8762a88e64a2efb25928804887cbef1124cd462e6c0c1a7764f99767" Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.453859 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.468657 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nrbr2"] Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.493431 4717 scope.go:117] "RemoveContainer" containerID="aca05d3d5607d458e03a506a05662258803bdd7199046e3c3e68b4114ead68e3" Oct 07 15:13:02 crc kubenswrapper[4717]: I1007 15:13:02.883233 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" path="/var/lib/kubelet/pods/242c5b9d-7a01-4e8f-ba45-c16fef61268b/volumes" Oct 07 15:13:31 crc kubenswrapper[4717]: I1007 15:13:31.610274 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:13:31 crc kubenswrapper[4717]: I1007 15:13:31.611275 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.610270 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.610848 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.610897 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.611664 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.611716 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b" gracePeriod=600 Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.938459 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b" exitCode=0 Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.938518 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b"} Oct 07 15:14:01 crc kubenswrapper[4717]: I1007 15:14:01.938806 4717 scope.go:117] "RemoveContainer" containerID="6ff1cf198d339606cd403f5ec1d5365c3ec1fe514e4c7f46025faaefe4fa2d25" Oct 07 15:14:02 crc kubenswrapper[4717]: I1007 15:14:02.950111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a"} Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.173858 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4"] Oct 07 15:15:00 crc kubenswrapper[4717]: E1007 15:15:00.175188 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="extract-content" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.175210 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="extract-content" Oct 07 15:15:00 crc kubenswrapper[4717]: E1007 15:15:00.175251 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="registry-server" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.175260 4717 state_mem.go:107] "Deleted CPUSet assignment" 
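
The machine-config-daemon entries above (15:13:31 through 15:14:02) record a liveness probe repeatedly failing against http://127.0.0.1:8798/health with "connection refused", after which the kubelet marks the container unhealthy, kills it with a 600s grace period, and starts a replacement. A minimal stdlib sketch of that probe loop follows; only the endpoint is taken from the log, while the 30-second period (the failures above are about 30s apart) and the failure threshold are assumptions, not values read from the pod spec:

package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	const (
		url       = "http://127.0.0.1:8798/health" // endpoint from the logged probe output
		period    = 30 * time.Second               // assumed probe period (~30s between failures above)
		threshold = 3                               // assumed failure threshold
	)
	failures := 0
	for {
		resp, err := http.Get(url)
		healthy := err == nil && resp.StatusCode < 400
		if resp != nil {
			resp.Body.Close()
		}
		if healthy {
			failures = 0
		} else {
			failures++ // e.g. "connect: connection refused" as in the log
			fmt.Println("probe failed:", err)
		}
		if failures >= threshold {
			fmt.Println("liveness threshold reached; the kubelet would kill and restart the container")
			failures = 0
		}
		time.Sleep(period)
	}
}

In the pod spec these knobs presumably correspond to the container's livenessProbe httpGet/periodSeconds/failureThreshold fields, and the gracePeriod=600 in the kill message to its termination grace period.
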
podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="registry-server" Oct 07 15:15:00 crc kubenswrapper[4717]: E1007 15:15:00.175282 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="extract-utilities" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.175291 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="extract-utilities" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.175642 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="242c5b9d-7a01-4e8f-ba45-c16fef61268b" containerName="registry-server" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.176748 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.179585 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.180720 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.191402 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4"] Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.240580 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v78zg\" (UniqueName: \"kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.240695 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.240800 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.346305 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v78zg\" (UniqueName: \"kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.346402 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume\") pod \"collect-profiles-29330835-jkqz4\" 
(UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.346502 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.347554 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.401144 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.406031 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v78zg\" (UniqueName: \"kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg\") pod \"collect-profiles-29330835-jkqz4\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:00 crc kubenswrapper[4717]: I1007 15:15:00.499448 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:01 crc kubenswrapper[4717]: I1007 15:15:01.004261 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4"] Oct 07 15:15:01 crc kubenswrapper[4717]: W1007 15:15:01.006228 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a36a930_61ea_42cb_aab1_a9a1d1e6046a.slice/crio-fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b WatchSource:0}: Error finding container fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b: Status 404 returned error can't find the container with id fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b Oct 07 15:15:01 crc kubenswrapper[4717]: I1007 15:15:01.483298 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" event={"ID":"3a36a930-61ea-42cb-aab1-a9a1d1e6046a","Type":"ContainerStarted","Data":"fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b"} Oct 07 15:15:02 crc kubenswrapper[4717]: I1007 15:15:02.492881 4717 generic.go:334] "Generic (PLEG): container finished" podID="3a36a930-61ea-42cb-aab1-a9a1d1e6046a" containerID="b172571f67f895985b8e7f9ccdd0f6efb5f2540dbc5c8d1f5fd4166dde50c218" exitCode=0 Oct 07 15:15:02 crc kubenswrapper[4717]: I1007 15:15:02.492987 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" event={"ID":"3a36a930-61ea-42cb-aab1-a9a1d1e6046a","Type":"ContainerDied","Data":"b172571f67f895985b8e7f9ccdd0f6efb5f2540dbc5c8d1f5fd4166dde50c218"} Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.887972 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.921000 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume\") pod \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.921073 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume\") pod \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.921116 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v78zg\" (UniqueName: \"kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg\") pod \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\" (UID: \"3a36a930-61ea-42cb-aab1-a9a1d1e6046a\") " Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.922024 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a36a930-61ea-42cb-aab1-a9a1d1e6046a" (UID: "3a36a930-61ea-42cb-aab1-a9a1d1e6046a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.929707 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a36a930-61ea-42cb-aab1-a9a1d1e6046a" (UID: "3a36a930-61ea-42cb-aab1-a9a1d1e6046a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:15:03 crc kubenswrapper[4717]: I1007 15:15:03.936435 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg" (OuterVolumeSpecName: "kube-api-access-v78zg") pod "3a36a930-61ea-42cb-aab1-a9a1d1e6046a" (UID: "3a36a930-61ea-42cb-aab1-a9a1d1e6046a"). InnerVolumeSpecName "kube-api-access-v78zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.023638 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.023700 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.023714 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v78zg\" (UniqueName: \"kubernetes.io/projected/3a36a930-61ea-42cb-aab1-a9a1d1e6046a-kube-api-access-v78zg\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.515884 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" event={"ID":"3a36a930-61ea-42cb-aab1-a9a1d1e6046a","Type":"ContainerDied","Data":"fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b"} Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.516621 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbd500de50404209743dc089044e508a893089899f262cb13b3aedd5e8eaab1b" Oct 07 15:15:04 crc kubenswrapper[4717]: I1007 15:15:04.515938 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330835-jkqz4" Oct 07 15:15:05 crc kubenswrapper[4717]: I1007 15:15:05.015127 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh"] Oct 07 15:15:05 crc kubenswrapper[4717]: I1007 15:15:05.034507 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330790-m9pdh"] Oct 07 15:15:06 crc kubenswrapper[4717]: I1007 15:15:06.881173 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b5f91a-e982-4972-8886-78655079fb8d" path="/var/lib/kubelet/pods/d1b5f91a-e982-4972-8886-78655079fb8d/volumes" Oct 07 15:15:44 crc kubenswrapper[4717]: I1007 15:15:44.905333 4717 generic.go:334] "Generic (PLEG): container finished" podID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" containerID="826cda5c8ba26449da2b79bb50d05cc5d4c96cf4c3371a5e201a2643a7e0d86a" exitCode=0 Oct 07 15:15:44 crc kubenswrapper[4717]: I1007 15:15:44.905456 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7","Type":"ContainerDied","Data":"826cda5c8ba26449da2b79bb50d05cc5d4c96cf4c3371a5e201a2643a7e0d86a"} Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.268985 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330406 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330494 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330523 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330564 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330632 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330655 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: 
\"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330698 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330727 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.330768 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9bnw\" (UniqueName: \"kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw\") pod \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\" (UID: \"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7\") " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.336812 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data" (OuterVolumeSpecName: "config-data") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.343471 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.346123 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.346289 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.367242 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw" (OuterVolumeSpecName: "kube-api-access-t9bnw") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "kube-api-access-t9bnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.377044 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.377118 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.388190 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.393394 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" (UID: "1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.394513 4717 scope.go:117] "RemoveContainer" containerID="f2211b45882616aabaa640ab8c1ffefc19cb940f7a3bc695ccb94e11df9f5044" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433280 4717 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433332 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433348 4717 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433359 4717 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433372 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433385 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9bnw\" (UniqueName: 
\"kubernetes.io/projected/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-kube-api-access-t9bnw\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433407 4717 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-config-data\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433417 4717 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.433461 4717 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.460664 4717 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.536292 4717 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.921974 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7","Type":"ContainerDied","Data":"5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06"} Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.922046 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2594948c35c198d4ac8bf1a8dac623ed44474151b6ce9862382908ba800b06" Oct 07 15:15:46 crc kubenswrapper[4717]: I1007 15:15:46.922102 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.944838 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 15:15:48 crc kubenswrapper[4717]: E1007 15:15:48.946880 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a36a930-61ea-42cb-aab1-a9a1d1e6046a" containerName="collect-profiles" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.946920 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a36a930-61ea-42cb-aab1-a9a1d1e6046a" containerName="collect-profiles" Oct 07 15:15:48 crc kubenswrapper[4717]: E1007 15:15:48.946945 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" containerName="tempest-tests-tempest-tests-runner" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.946953 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" containerName="tempest-tests-tempest-tests-runner" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.948496 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7" containerName="tempest-tests-tempest-tests-runner" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.948523 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a36a930-61ea-42cb-aab1-a9a1d1e6046a" containerName="collect-profiles" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.949358 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:48 crc kubenswrapper[4717]: I1007 15:15:48.959959 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.085091 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb49f\" (UniqueName: \"kubernetes.io/projected/5f86d81b-0adf-45b3-815e-4ba7029d821d-kube-api-access-gb49f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.085143 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.187108 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb49f\" (UniqueName: \"kubernetes.io/projected/5f86d81b-0adf-45b3-815e-4ba7029d821d-kube-api-access-gb49f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.187469 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" 
(UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.187833 4717 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.211285 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb49f\" (UniqueName: \"kubernetes.io/projected/5f86d81b-0adf-45b3-815e-4ba7029d821d-kube-api-access-gb49f\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.235263 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"5f86d81b-0adf-45b3-815e-4ba7029d821d\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.271956 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.737893 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.746900 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:15:49 crc kubenswrapper[4717]: I1007 15:15:49.956468 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5f86d81b-0adf-45b3-815e-4ba7029d821d","Type":"ContainerStarted","Data":"a1fa3f9d0b08cd7677c9b8a196433c8c77f123f399baabeb38371169fa8e90ac"} Oct 07 15:15:51 crc kubenswrapper[4717]: I1007 15:15:51.978814 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"5f86d81b-0adf-45b3-815e-4ba7029d821d","Type":"ContainerStarted","Data":"5e7f1a5e54ffc714fd755d1a0a44e4da717d66fc449a39a539d3a5da8d9287c8"} Oct 07 15:15:52 crc kubenswrapper[4717]: I1007 15:15:52.001473 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.716843557 podStartE2EDuration="4.001451198s" podCreationTimestamp="2025-10-07 15:15:48 +0000 UTC" firstStartedPulling="2025-10-07 15:15:49.746652485 +0000 UTC m=+4931.574578287" lastFinishedPulling="2025-10-07 15:15:51.031260136 +0000 UTC m=+4932.859185928" observedRunningTime="2025-10-07 15:15:51.99496633 +0000 UTC m=+4933.822892182" watchObservedRunningTime="2025-10-07 15:15:52.001451198 +0000 UTC m=+4933.829376990" Oct 07 15:16:01 crc kubenswrapper[4717]: I1007 15:16:01.609742 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
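
The "Observed pod startup duration" entry above can be checked from its own fields: the end-to-end duration (podStartE2EDuration=4.001451198s) minus the image-pull window (lastFinishedPulling − firstStartedPulling ≈ 1.2846s) gives the reported podStartSLOduration of roughly 2.7168s, i.e. the SLO figure excludes time spent pulling the image. A small Go check of that arithmetic, with the timestamps copied from the log line:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps and duration copied from the log entry above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	firstStartedPulling, _ := time.Parse(layout, "2025-10-07 15:15:49.746652485 +0000 UTC")
	lastFinishedPulling, _ := time.Parse(layout, "2025-10-07 15:15:51.031260136 +0000 UTC")
	e2e := 4001451198 * time.Nanosecond // podStartE2EDuration=4.001451198s

	pull := lastFinishedPulling.Sub(firstStartedPulling)
	fmt.Println("image pull window:", pull)     // ≈ 1.284607651s
	fmt.Println("SLO duration     :", e2e-pull) // ≈ 2.716843547s, matching podStartSLOduration
}

The few-nanosecond mismatch against the logged 2.716843557 is likely just the monotonic-clock readings (the m=+... offsets) the kubelet uses internally versus the wall-clock values printed here.
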
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:16:01 crc kubenswrapper[4717]: I1007 15:16:01.610607 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.809455 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rbch/must-gather-n7ldt"] Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.812512 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.814918 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8rbch"/"kube-root-ca.crt" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.815231 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-8rbch"/"default-dockercfg-bzt2d" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.815312 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-8rbch"/"openshift-service-ca.crt" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.820104 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rbch/must-gather-n7ldt"] Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.963462 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:10 crc kubenswrapper[4717]: I1007 15:16:10.963632 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjlqp\" (UniqueName: \"kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:11 crc kubenswrapper[4717]: I1007 15:16:11.065592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjlqp\" (UniqueName: \"kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:11 crc kubenswrapper[4717]: I1007 15:16:11.065967 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:11 crc kubenswrapper[4717]: I1007 15:16:11.067054 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " 
pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:11 crc kubenswrapper[4717]: I1007 15:16:11.085974 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjlqp\" (UniqueName: \"kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp\") pod \"must-gather-n7ldt\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:11 crc kubenswrapper[4717]: I1007 15:16:11.131289 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:16:12 crc kubenswrapper[4717]: I1007 15:16:12.099469 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rbch/must-gather-n7ldt"] Oct 07 15:16:12 crc kubenswrapper[4717]: I1007 15:16:12.172411 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/must-gather-n7ldt" event={"ID":"d7e9085d-a33d-4850-9c89-0f29ccddf977","Type":"ContainerStarted","Data":"106e678eafad10c80f158d128906f3005d87758964f7bc24d4f40f102bfcd9f2"} Oct 07 15:16:16 crc kubenswrapper[4717]: I1007 15:16:16.225035 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/must-gather-n7ldt" event={"ID":"d7e9085d-a33d-4850-9c89-0f29ccddf977","Type":"ContainerStarted","Data":"9ed70b47cd1d10f73babbd4a0bdc5cb59d1328ef1d0fdee8da8e5326dda5a594"} Oct 07 15:16:17 crc kubenswrapper[4717]: I1007 15:16:17.262214 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/must-gather-n7ldt" event={"ID":"d7e9085d-a33d-4850-9c89-0f29ccddf977","Type":"ContainerStarted","Data":"af38d3008dbfed0d2d7ad43f2727a0aab6bb34072ea0489482833035092bdb25"} Oct 07 15:16:17 crc kubenswrapper[4717]: I1007 15:16:17.298466 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rbch/must-gather-n7ldt" podStartSLOduration=3.530739344 podStartE2EDuration="7.298447977s" podCreationTimestamp="2025-10-07 15:16:10 +0000 UTC" firstStartedPulling="2025-10-07 15:16:12.103741831 +0000 UTC m=+4953.931667623" lastFinishedPulling="2025-10-07 15:16:15.871450474 +0000 UTC m=+4957.699376256" observedRunningTime="2025-10-07 15:16:17.284308169 +0000 UTC m=+4959.112233981" watchObservedRunningTime="2025-10-07 15:16:17.298447977 +0000 UTC m=+4959.126373759" Oct 07 15:16:22 crc kubenswrapper[4717]: I1007 15:16:22.761185 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rbch/crc-debug-xmg6f"] Oct 07 15:16:22 crc kubenswrapper[4717]: I1007 15:16:22.762969 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:22 crc kubenswrapper[4717]: I1007 15:16:22.921890 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8m9x\" (UniqueName: \"kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:22 crc kubenswrapper[4717]: I1007 15:16:22.922024 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.023515 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.023642 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8m9x\" (UniqueName: \"kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.024139 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.044229 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8m9x\" (UniqueName: \"kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x\") pod \"crc-debug-xmg6f\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.084134 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:16:23 crc kubenswrapper[4717]: I1007 15:16:23.315964 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" event={"ID":"a6cb4061-31ca-45b1-b17a-f55d8f04bb05","Type":"ContainerStarted","Data":"c27282504b02839a6017764e2149ad98cec7a6b415cc23375323b86146027263"} Oct 07 15:16:31 crc kubenswrapper[4717]: I1007 15:16:31.609668 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:16:31 crc kubenswrapper[4717]: I1007 15:16:31.610257 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:16:34 crc kubenswrapper[4717]: I1007 15:16:34.439667 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" event={"ID":"a6cb4061-31ca-45b1-b17a-f55d8f04bb05","Type":"ContainerStarted","Data":"9a45528d3ae9b5371191780f5ba5358779eca6ca1b0714ff4c74c4962e22e495"} Oct 07 15:16:34 crc kubenswrapper[4717]: I1007 15:16:34.455228 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" podStartSLOduration=1.41419402 podStartE2EDuration="12.455204972s" podCreationTimestamp="2025-10-07 15:16:22 +0000 UTC" firstStartedPulling="2025-10-07 15:16:23.147661684 +0000 UTC m=+4964.975587476" lastFinishedPulling="2025-10-07 15:16:34.188672636 +0000 UTC m=+4976.016598428" observedRunningTime="2025-10-07 15:16:34.452314663 +0000 UTC m=+4976.280240455" watchObservedRunningTime="2025-10-07 15:16:34.455204972 +0000 UTC m=+4976.283130774" Oct 07 15:17:01 crc kubenswrapper[4717]: I1007 15:17:01.610248 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:17:01 crc kubenswrapper[4717]: I1007 15:17:01.611071 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:17:01 crc kubenswrapper[4717]: I1007 15:17:01.611176 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 15:17:01 crc kubenswrapper[4717]: I1007 15:17:01.612108 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:17:01 crc kubenswrapper[4717]: I1007 15:17:01.612442 4717 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" gracePeriod=600 Oct 07 15:17:02 crc kubenswrapper[4717]: E1007 15:17:02.268761 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:02 crc kubenswrapper[4717]: I1007 15:17:02.724252 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" exitCode=0 Oct 07 15:17:02 crc kubenswrapper[4717]: I1007 15:17:02.724331 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a"} Oct 07 15:17:02 crc kubenswrapper[4717]: I1007 15:17:02.724813 4717 scope.go:117] "RemoveContainer" containerID="a215eab820b3f3d4a04e853f9fee64dc004bceefcdde25cac2dfd52bbd5c326b" Oct 07 15:17:02 crc kubenswrapper[4717]: I1007 15:17:02.725546 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:17:02 crc kubenswrapper[4717]: E1007 15:17:02.725817 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:12 crc kubenswrapper[4717]: I1007 15:17:12.869211 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:17:12 crc kubenswrapper[4717]: E1007 15:17:12.870075 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:25 crc kubenswrapper[4717]: I1007 15:17:25.868791 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:17:25 crc kubenswrapper[4717]: E1007 15:17:25.869817 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
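
The repeated CrashLoopBackOff errors above all report the same "back-off 5m0s" because the kubelet's per-container restart back-off grows from a small initial delay and is capped; once a container has crashed often enough, the delay sits at the cap and every retry logs 5m0s. The sketch below assumes the upstream defaults (10s initial delay, doubling, 5m cap); they are not read from this node's configuration:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second          // assumed default initial container back-off
	const maxBackoff = 5 * time.Minute // assumed default cap (the "5m0s" in the messages above)
	for crash := 1; crash <= 8; crash++ {
		fmt.Printf("crash %d: next restart delayed by %v\n", crash, delay)
		delay *= 2
		if delay > maxBackoff {
			delay = maxBackoff // stays at 5m0s once the cap is reached, as in the log
		}
	}
}

Each "RemoveContainer" followed by "Error syncing pod, skipping ... CrashLoopBackOff" at 15:17:02, 15:17:12 and 15:17:25 appears to be a sync attempt rejected while that back-off window is still open.
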
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:32 crc kubenswrapper[4717]: I1007 15:17:32.149064 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbb74df64-qfwg8_2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb/barbican-api/0.log" Oct 07 15:17:32 crc kubenswrapper[4717]: I1007 15:17:32.193242 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbb74df64-qfwg8_2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb/barbican-api-log/0.log" Oct 07 15:17:32 crc kubenswrapper[4717]: I1007 15:17:32.455151 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8474564558-fss6t_237c55e8-afd6-4798-b0cf-7f8b20b0e323/barbican-keystone-listener/0.log" Oct 07 15:17:32 crc kubenswrapper[4717]: I1007 15:17:32.713513 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b648ffc67-ks8w7_c96f41ab-26d3-44e4-8ad3-732b104a09df/barbican-worker/0.log" Oct 07 15:17:32 crc kubenswrapper[4717]: I1007 15:17:32.988966 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b648ffc67-ks8w7_c96f41ab-26d3-44e4-8ad3-732b104a09df/barbican-worker-log/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.338043 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp_026b85b1-41cc-4a6f-9638-909bc0e6099e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.342516 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8474564558-fss6t_237c55e8-afd6-4798-b0cf-7f8b20b0e323/barbican-keystone-listener-log/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.594132 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/proxy-httpd/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.685833 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/ceilometer-central-agent/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.687900 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/ceilometer-notification-agent/0.log" Oct 07 15:17:33 crc kubenswrapper[4717]: I1007 15:17:33.899829 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/sg-core/0.log" Oct 07 15:17:34 crc kubenswrapper[4717]: I1007 15:17:34.045191 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_cd7e0d52-4d10-4898-8949-7f3dc9875fe9/ceph/0.log" Oct 07 15:17:34 crc kubenswrapper[4717]: I1007 15:17:34.554550 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2ad858d-25ea-41d2-8499-188e72ca0873/cinder-api/0.log" Oct 07 15:17:34 crc kubenswrapper[4717]: I1007 15:17:34.654338 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2ad858d-25ea-41d2-8499-188e72ca0873/cinder-api-log/0.log" Oct 07 15:17:34 crc kubenswrapper[4717]: I1007 15:17:34.948273 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d6be3c5a-b5c5-49a0-ae43-f90c44cf6496/probe/0.log" Oct 07 15:17:35 crc kubenswrapper[4717]: I1007 15:17:35.156590 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_348f889a-8a84-4e54-81cf-46ee269e85d9/cinder-scheduler/0.log" Oct 07 15:17:35 crc kubenswrapper[4717]: I1007 15:17:35.288234 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_348f889a-8a84-4e54-81cf-46ee269e85d9/probe/0.log" Oct 07 15:17:35 crc kubenswrapper[4717]: I1007 15:17:35.620933 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_315f03ca-ba00-4899-b836-72bc9a1970eb/probe/0.log" Oct 07 15:17:35 crc kubenswrapper[4717]: I1007 15:17:35.813980 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5_c4007561-6cfe-400e-81b9-d60b36d79171/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:36 crc kubenswrapper[4717]: I1007 15:17:36.109144 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6_27f71967-9d43-4b11-a286-1544c15adc41/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:36 crc kubenswrapper[4717]: I1007 15:17:36.339553 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dppcv_b7e5a28d-f380-447f-998f-5e65280d3651/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:36 crc kubenswrapper[4717]: I1007 15:17:36.629139 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/init/0.log" Oct 07 15:17:36 crc kubenswrapper[4717]: I1007 15:17:36.716988 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/init/0.log" Oct 07 15:17:37 crc kubenswrapper[4717]: I1007 15:17:37.116741 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/dnsmasq-dns/0.log" Oct 07 15:17:37 crc kubenswrapper[4717]: I1007 15:17:37.214615 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l_769125a8-8870-4ddf-86e3-cb1bfa198b41/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:37 crc kubenswrapper[4717]: I1007 15:17:37.476394 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c19f33db-3446-415d-8d32-8f18ac112e2e/glance-httpd/0.log" Oct 07 15:17:37 crc kubenswrapper[4717]: I1007 15:17:37.583991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c19f33db-3446-415d-8d32-8f18ac112e2e/glance-log/0.log" Oct 07 15:17:37 crc kubenswrapper[4717]: I1007 15:17:37.940507 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5f3e51b-cc26-48b7-ace9-206022bfc021/glance-httpd/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.039790 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5f3e51b-cc26-48b7-ace9-206022bfc021/glance-log/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.206363 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d6be3c5a-b5c5-49a0-ae43-f90c44cf6496/cinder-backup/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.396418 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_horizon-ff6468b6d-j9vqb_fabca24d-42a9-45e6-81ca-ad04bf8bd588/horizon/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.630396 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kljwc_fda137df-3fa6-470a-b41a-db9f55a550ab/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.825834 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_315f03ca-ba00-4899-b836-72bc9a1970eb/cinder-volume/0.log" Oct 07 15:17:38 crc kubenswrapper[4717]: I1007 15:17:38.891466 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xfj57_bbf34104-46e2-4650-bdd0-f3f8cfb6d590/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:39 crc kubenswrapper[4717]: I1007 15:17:39.119132 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-ff6468b6d-j9vqb_fabca24d-42a9-45e6-81ca-ad04bf8bd588/horizon-log/0.log" Oct 07 15:17:39 crc kubenswrapper[4717]: I1007 15:17:39.714773 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330821-ncmw5_6cd311ec-9f03-47a4-8e27-5e4553bf4c1d/keystone-cron/0.log" Oct 07 15:17:39 crc kubenswrapper[4717]: I1007 15:17:39.881513 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9bcc7eb6-640f-4300-9943-2ba004773b3b/kube-state-metrics/0.log" Oct 07 15:17:40 crc kubenswrapper[4717]: I1007 15:17:40.189829 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz_73f6a900-f08d-4207-b89d-d8acfd404b8d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:40 crc kubenswrapper[4717]: I1007 15:17:40.643524 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3351608d-f560-40bb-a1f3-c8711a80a7a4/manila-api/0.log" Oct 07 15:17:40 crc kubenswrapper[4717]: I1007 15:17:40.869157 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:17:40 crc kubenswrapper[4717]: E1007 15:17:40.869607 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:41 crc kubenswrapper[4717]: I1007 15:17:41.012052 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4dd7329a-b653-4574-864a-9d86cbb87ed3/manila-scheduler/0.log" Oct 07 15:17:41 crc kubenswrapper[4717]: I1007 15:17:41.396558 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3351608d-f560-40bb-a1f3-c8711a80a7a4/manila-api-log/0.log" Oct 07 15:17:41 crc kubenswrapper[4717]: I1007 15:17:41.562845 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4dd7329a-b653-4574-864a-9d86cbb87ed3/probe/0.log" Oct 07 15:17:41 crc kubenswrapper[4717]: I1007 15:17:41.900864 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_manila-share-share1-0_7fc0839c-9b96-43c5-9111-5d9c24b471b9/probe/0.log" Oct 07 15:17:42 crc kubenswrapper[4717]: I1007 15:17:42.049225 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_7fc0839c-9b96-43c5-9111-5d9c24b471b9/manila-share/0.log" Oct 07 15:17:43 crc kubenswrapper[4717]: I1007 15:17:43.487174 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59986d7f85-m2phf_c8606e01-73db-4dce-92b0-89d47762aa09/neutron-httpd/0.log" Oct 07 15:17:43 crc kubenswrapper[4717]: I1007 15:17:43.609350 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74d44865f4-vrndk_230706b7-0a7f-4b69-973f-4e9c1253e7b9/keystone-api/0.log" Oct 07 15:17:43 crc kubenswrapper[4717]: I1007 15:17:43.867178 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w_310bfdcc-9c71-4075-b8d3-af7c21dc3165/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:43 crc kubenswrapper[4717]: I1007 15:17:43.987509 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59986d7f85-m2phf_c8606e01-73db-4dce-92b0-89d47762aa09/neutron-api/0.log" Oct 07 15:17:45 crc kubenswrapper[4717]: I1007 15:17:45.095390 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_cb822ac3-55c1-4745-bc6d-570b89e66108/nova-cell0-conductor-conductor/0.log" Oct 07 15:17:45 crc kubenswrapper[4717]: I1007 15:17:45.770119 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6d3e722a-09cb-4b09-856e-b4752de9e30e/nova-cell1-conductor-conductor/0.log" Oct 07 15:17:46 crc kubenswrapper[4717]: I1007 15:17:46.434665 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d4241d67-38b0-4e1c-83d6-9a0531e6d902/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 15:17:46 crc kubenswrapper[4717]: I1007 15:17:46.899692 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_975022f5-b6f2-4d3f-adbb-f4b878cd2758/nova-api-log/0.log" Oct 07 15:17:47 crc kubenswrapper[4717]: I1007 15:17:47.044329 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-xxwpx_44a1daee-eac0-4c51-ae29-1afa919bcb68/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:47 crc kubenswrapper[4717]: I1007 15:17:47.392080 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0c62351c-f8eb-4014-8391-229d81411849/nova-metadata-log/0.log" Oct 07 15:17:47 crc kubenswrapper[4717]: I1007 15:17:47.533986 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_975022f5-b6f2-4d3f-adbb-f4b878cd2758/nova-api-api/0.log" Oct 07 15:17:48 crc kubenswrapper[4717]: I1007 15:17:48.270640 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/mysql-bootstrap/0.log" Oct 07 15:17:48 crc kubenswrapper[4717]: I1007 15:17:48.457280 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_beebd3d0-0aca-4953-9f04-ea98632ae71b/nova-scheduler-scheduler/0.log" Oct 07 15:17:48 crc kubenswrapper[4717]: I1007 15:17:48.678943 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/mysql-bootstrap/0.log" Oct 07 15:17:48 
crc kubenswrapper[4717]: I1007 15:17:48.692259 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/galera/0.log" Oct 07 15:17:49 crc kubenswrapper[4717]: I1007 15:17:49.485795 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/mysql-bootstrap/0.log" Oct 07 15:17:49 crc kubenswrapper[4717]: I1007 15:17:49.673614 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/mysql-bootstrap/0.log" Oct 07 15:17:49 crc kubenswrapper[4717]: I1007 15:17:49.788848 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/galera/0.log" Oct 07 15:17:49 crc kubenswrapper[4717]: I1007 15:17:49.842649 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0c62351c-f8eb-4014-8391-229d81411849/nova-metadata-metadata/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.044121 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2/openstackclient/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.298766 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lh5mj_190219f8-b7b5-4cbd-ab5b-3fd1880f9eef/openstack-network-exporter/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.370795 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server-init/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.542092 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server-init/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.602864 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovs-vswitchd/0.log" Oct 07 15:17:50 crc kubenswrapper[4717]: I1007 15:17:50.642995 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server/0.log" Oct 07 15:17:51 crc kubenswrapper[4717]: I1007 15:17:51.293172 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vr6v8_f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c/ovn-controller/0.log" Oct 07 15:17:51 crc kubenswrapper[4717]: I1007 15:17:51.605652 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xgpx2_29dd3c38-62bb-4f7c-9cef-7ab420156b0c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:51 crc kubenswrapper[4717]: I1007 15:17:51.674565 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57dca108-f9e5-443a-aa97-01263cf96863/openstack-network-exporter/0.log" Oct 07 15:17:51 crc kubenswrapper[4717]: I1007 15:17:51.779165 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57dca108-f9e5-443a-aa97-01263cf96863/ovn-northd/0.log" Oct 07 15:17:51 crc kubenswrapper[4717]: I1007 15:17:51.916666 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_272dcaf8-29ce-4329-8301-4123eea773dc/openstack-network-exporter/0.log" Oct 07 15:17:52 crc 
kubenswrapper[4717]: I1007 15:17:52.114526 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d203dc4-3d0b-4e7c-b38b-96f231f12071/openstack-network-exporter/0.log" Oct 07 15:17:52 crc kubenswrapper[4717]: I1007 15:17:52.122705 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_272dcaf8-29ce-4329-8301-4123eea773dc/ovsdbserver-nb/0.log" Oct 07 15:17:52 crc kubenswrapper[4717]: I1007 15:17:52.324895 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d203dc4-3d0b-4e7c-b38b-96f231f12071/ovsdbserver-sb/0.log" Oct 07 15:17:52 crc kubenswrapper[4717]: I1007 15:17:52.761371 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5848b6684d-cddjh_bd72a516-7ee9-4ff2-a7b0-f928a10e676d/placement-api/0.log" Oct 07 15:17:52 crc kubenswrapper[4717]: I1007 15:17:52.823596 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/setup-container/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.081660 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5848b6684d-cddjh_bd72a516-7ee9-4ff2-a7b0-f928a10e676d/placement-log/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.108489 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/rabbitmq/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.123072 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/setup-container/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.313178 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/setup-container/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.566541 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/rabbitmq/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.588737 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/setup-container/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.764072 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5_bf9bf972-fbac-4b24-bf35-2cf668fca79d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:53 crc kubenswrapper[4717]: I1007 15:17:53.901995 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-24mzn_394bf866-16b7-4c6a-a729-0a716c1bb5de/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.086122 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx_1da9937b-d5ca-4f21-b803-ef9121b48f23/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.314310 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9xqjk_c4d5f4c4-7e2a-46b8-8331-3372d6a7e825/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.405515 4717 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d9g97_17107a9c-f715-4bd1-88ac-d79c769fd4e4/ssh-known-hosts-edpm-deployment/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.652973 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9896cd659-vvdxn_f44baabf-f1c4-4036-8c6b-ce32cc6cf541/proxy-server/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.848307 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9896cd659-vvdxn_f44baabf-f1c4-4036-8c6b-ce32cc6cf541/proxy-httpd/0.log" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.869907 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:17:54 crc kubenswrapper[4717]: E1007 15:17:54.870215 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:17:54 crc kubenswrapper[4717]: I1007 15:17:54.892858 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jgbc8_bf613982-6142-4491-931b-ad2e2b2b637f/swift-ring-rebalance/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.093519 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-reaper/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.126759 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-auditor/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.324998 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-replicator/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.369782 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-server/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.418526 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-auditor/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.593404 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-replicator/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.671183 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-updater/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.681669 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-server/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.807423 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-auditor/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.903344 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-replicator/0.log" Oct 07 15:17:55 crc kubenswrapper[4717]: I1007 15:17:55.926438 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-expirer/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.019028 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-server/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.159993 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/rsync/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.164507 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-updater/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.270598 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/swift-recon-cron/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.979870 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:17:56 crc kubenswrapper[4717]: I1007 15:17:56.981756 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj_87e2107a-8eec-497a-b811-7d339dbfe176/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:17:57 crc kubenswrapper[4717]: I1007 15:17:57.012788 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5f86d81b-0adf-45b3-815e-4ba7029d821d/test-operator-logs-container/0.log" Oct 07 15:17:57 crc kubenswrapper[4717]: I1007 15:17:57.352417 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zthb2_7499d2ef-057a-4267-9725-bb62675d9eb8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:18:00 crc kubenswrapper[4717]: I1007 15:18:00.865846 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3a693669-0da0-46aa-a110-75593768011d/memcached/0.log" Oct 07 15:18:09 crc kubenswrapper[4717]: I1007 15:18:09.868510 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:18:09 crc kubenswrapper[4717]: E1007 15:18:09.869429 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:18:21 crc kubenswrapper[4717]: I1007 15:18:21.868496 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:18:21 crc kubenswrapper[4717]: E1007 15:18:21.869430 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:18:34 crc kubenswrapper[4717]: I1007 15:18:34.873732 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:18:34 crc kubenswrapper[4717]: E1007 15:18:34.875151 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:18:48 crc kubenswrapper[4717]: I1007 15:18:48.875564 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:18:48 crc kubenswrapper[4717]: E1007 15:18:48.876387 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:00 crc kubenswrapper[4717]: I1007 15:19:00.869271 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:19:00 crc kubenswrapper[4717]: E1007 15:19:00.870581 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:01 crc kubenswrapper[4717]: I1007 15:19:01.884760 4717 generic.go:334] "Generic (PLEG): container finished" podID="a6cb4061-31ca-45b1-b17a-f55d8f04bb05" containerID="9a45528d3ae9b5371191780f5ba5358779eca6ca1b0714ff4c74c4962e22e495" exitCode=0 Oct 07 15:19:01 crc kubenswrapper[4717]: I1007 15:19:01.885288 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" event={"ID":"a6cb4061-31ca-45b1-b17a-f55d8f04bb05","Type":"ContainerDied","Data":"9a45528d3ae9b5371191780f5ba5358779eca6ca1b0714ff4c74c4962e22e495"} Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.027804 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.065799 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-xmg6f"] Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.102537 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8m9x\" (UniqueName: \"kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x\") pod \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.102626 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host\") pod \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\" (UID: \"a6cb4061-31ca-45b1-b17a-f55d8f04bb05\") " Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.110213 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host" (OuterVolumeSpecName: "host") pod "a6cb4061-31ca-45b1-b17a-f55d8f04bb05" (UID: "a6cb4061-31ca-45b1-b17a-f55d8f04bb05"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.119482 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-xmg6f"] Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.185085 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x" (OuterVolumeSpecName: "kube-api-access-t8m9x") pod "a6cb4061-31ca-45b1-b17a-f55d8f04bb05" (UID: "a6cb4061-31ca-45b1-b17a-f55d8f04bb05"). InnerVolumeSpecName "kube-api-access-t8m9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.209396 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8m9x\" (UniqueName: \"kubernetes.io/projected/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-kube-api-access-t8m9x\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.209438 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a6cb4061-31ca-45b1-b17a-f55d8f04bb05-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.908527 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c27282504b02839a6017764e2149ad98cec7a6b415cc23375323b86146027263" Oct 07 15:19:03 crc kubenswrapper[4717]: I1007 15:19:03.908607 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-xmg6f" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.449794 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rbch/crc-debug-tcd72"] Oct 07 15:19:04 crc kubenswrapper[4717]: E1007 15:19:04.450480 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cb4061-31ca-45b1-b17a-f55d8f04bb05" containerName="container-00" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.450496 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cb4061-31ca-45b1-b17a-f55d8f04bb05" containerName="container-00" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.450747 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cb4061-31ca-45b1-b17a-f55d8f04bb05" containerName="container-00" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.451491 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.534650 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.534796 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzbfc\" (UniqueName: \"kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.637167 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.637263 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzbfc\" (UniqueName: \"kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.637573 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.662655 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzbfc\" (UniqueName: \"kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc\") pod \"crc-debug-tcd72\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.768531 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.880140 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cb4061-31ca-45b1-b17a-f55d8f04bb05" path="/var/lib/kubelet/pods/a6cb4061-31ca-45b1-b17a-f55d8f04bb05/volumes" Oct 07 15:19:04 crc kubenswrapper[4717]: I1007 15:19:04.921971 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-tcd72" event={"ID":"b976659c-23c3-4285-a255-ecb54531ceac","Type":"ContainerStarted","Data":"4adecff35106f78e3210625c4de18e9c1fe0435fa68221e9b75dc21d2c46370f"} Oct 07 15:19:05 crc kubenswrapper[4717]: I1007 15:19:05.931612 4717 generic.go:334] "Generic (PLEG): container finished" podID="b976659c-23c3-4285-a255-ecb54531ceac" containerID="5e1e4a1df678ae1d12d66b5f9c2c9bd4f8ff9bd0360bb3a682a1bef7fe1dcedf" exitCode=0 Oct 07 15:19:05 crc kubenswrapper[4717]: I1007 15:19:05.931896 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-tcd72" event={"ID":"b976659c-23c3-4285-a255-ecb54531ceac","Type":"ContainerDied","Data":"5e1e4a1df678ae1d12d66b5f9c2c9bd4f8ff9bd0360bb3a682a1bef7fe1dcedf"} Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.071301 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.082349 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzbfc\" (UniqueName: \"kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc\") pod \"b976659c-23c3-4285-a255-ecb54531ceac\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.082460 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host\") pod \"b976659c-23c3-4285-a255-ecb54531ceac\" (UID: \"b976659c-23c3-4285-a255-ecb54531ceac\") " Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.082638 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host" (OuterVolumeSpecName: "host") pod "b976659c-23c3-4285-a255-ecb54531ceac" (UID: "b976659c-23c3-4285-a255-ecb54531ceac"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.083311 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b976659c-23c3-4285-a255-ecb54531ceac-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.099672 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc" (OuterVolumeSpecName: "kube-api-access-jzbfc") pod "b976659c-23c3-4285-a255-ecb54531ceac" (UID: "b976659c-23c3-4285-a255-ecb54531ceac"). InnerVolumeSpecName "kube-api-access-jzbfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.183970 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzbfc\" (UniqueName: \"kubernetes.io/projected/b976659c-23c3-4285-a255-ecb54531ceac-kube-api-access-jzbfc\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.991854 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-tcd72" event={"ID":"b976659c-23c3-4285-a255-ecb54531ceac","Type":"ContainerDied","Data":"4adecff35106f78e3210625c4de18e9c1fe0435fa68221e9b75dc21d2c46370f"} Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.992184 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4adecff35106f78e3210625c4de18e9c1fe0435fa68221e9b75dc21d2c46370f" Oct 07 15:19:07 crc kubenswrapper[4717]: I1007 15:19:07.993784 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-tcd72" Oct 07 15:19:11 crc kubenswrapper[4717]: I1007 15:19:11.868745 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:19:11 crc kubenswrapper[4717]: E1007 15:19:11.870217 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:15 crc kubenswrapper[4717]: I1007 15:19:15.568705 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-tcd72"] Oct 07 15:19:15 crc kubenswrapper[4717]: I1007 15:19:15.577896 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-tcd72"] Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.724883 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rbch/crc-debug-n28r8"] Oct 07 15:19:16 crc kubenswrapper[4717]: E1007 15:19:16.725583 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b976659c-23c3-4285-a255-ecb54531ceac" containerName="container-00" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.725595 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="b976659c-23c3-4285-a255-ecb54531ceac" containerName="container-00" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.725778 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="b976659c-23c3-4285-a255-ecb54531ceac" containerName="container-00" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.726584 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.856460 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.856545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f849\" (UniqueName: \"kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.884067 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b976659c-23c3-4285-a255-ecb54531ceac" path="/var/lib/kubelet/pods/b976659c-23c3-4285-a255-ecb54531ceac/volumes" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.958222 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.958293 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f849\" (UniqueName: \"kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.958404 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:16 crc kubenswrapper[4717]: I1007 15:19:16.980303 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f849\" (UniqueName: \"kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849\") pod \"crc-debug-n28r8\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:17 crc kubenswrapper[4717]: I1007 15:19:17.051887 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:18 crc kubenswrapper[4717]: I1007 15:19:18.099499 4717 generic.go:334] "Generic (PLEG): container finished" podID="6738049f-c98a-41ab-bbd3-43633b6b86bc" containerID="aa4b031183daec0d0a32d67f2e6595a2fbdacaf2d72c3f9075f86474199e3967" exitCode=0 Oct 07 15:19:18 crc kubenswrapper[4717]: I1007 15:19:18.099596 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-n28r8" event={"ID":"6738049f-c98a-41ab-bbd3-43633b6b86bc","Type":"ContainerDied","Data":"aa4b031183daec0d0a32d67f2e6595a2fbdacaf2d72c3f9075f86474199e3967"} Oct 07 15:19:18 crc kubenswrapper[4717]: I1007 15:19:18.100687 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/crc-debug-n28r8" event={"ID":"6738049f-c98a-41ab-bbd3-43633b6b86bc","Type":"ContainerStarted","Data":"5ca8161975f828ef97dbf72661cf8f8244a2e24c941aba99beda5fa2193f4315"} Oct 07 15:19:18 crc kubenswrapper[4717]: I1007 15:19:18.146978 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-n28r8"] Oct 07 15:19:18 crc kubenswrapper[4717]: I1007 15:19:18.155401 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8rbch/crc-debug-n28r8"] Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.238326 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.416607 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host\") pod \"6738049f-c98a-41ab-bbd3-43633b6b86bc\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.416708 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f849\" (UniqueName: \"kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849\") pod \"6738049f-c98a-41ab-bbd3-43633b6b86bc\" (UID: \"6738049f-c98a-41ab-bbd3-43633b6b86bc\") " Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.416840 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host" (OuterVolumeSpecName: "host") pod "6738049f-c98a-41ab-bbd3-43633b6b86bc" (UID: "6738049f-c98a-41ab-bbd3-43633b6b86bc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.417908 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6738049f-c98a-41ab-bbd3-43633b6b86bc-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.423486 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849" (OuterVolumeSpecName: "kube-api-access-8f849") pod "6738049f-c98a-41ab-bbd3-43633b6b86bc" (UID: "6738049f-c98a-41ab-bbd3-43633b6b86bc"). InnerVolumeSpecName "kube-api-access-8f849". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:19:19 crc kubenswrapper[4717]: I1007 15:19:19.520457 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f849\" (UniqueName: \"kubernetes.io/projected/6738049f-c98a-41ab-bbd3-43633b6b86bc-kube-api-access-8f849\") on node \"crc\" DevicePath \"\"" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.090401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.121739 4717 scope.go:117] "RemoveContainer" containerID="aa4b031183daec0d0a32d67f2e6595a2fbdacaf2d72c3f9075f86474199e3967" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.121773 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8rbch/crc-debug-n28r8" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.723689 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.756364 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.756681 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.879079 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6738049f-c98a-41ab-bbd3-43633b6b86bc" path="/var/lib/kubelet/pods/6738049f-c98a-41ab-bbd3-43633b6b86bc/volumes" Oct 07 15:19:20 crc kubenswrapper[4717]: I1007 15:19:20.995802 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.002157 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.023042 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/extract/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.180683 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-xd2vv_303da62c-428b-4494-9ad6-4168652dcd4e/kube-rbac-proxy/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.252463 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-sd746_c56f0658-961c-4727-8ba8-5d24af8523dd/kube-rbac-proxy/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.323040 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-xd2vv_303da62c-428b-4494-9ad6-4168652dcd4e/manager/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.467285 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-pfxp7_ee9e55bf-132a-44ef-82ca-0e1ac422afd3/kube-rbac-proxy/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.514058 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-sd746_c56f0658-961c-4727-8ba8-5d24af8523dd/manager/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.555978 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-pfxp7_ee9e55bf-132a-44ef-82ca-0e1ac422afd3/manager/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.761600 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8xb4k_0aa98a33-f3d0-49ee-8909-a88e320aa26c/kube-rbac-proxy/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.845206 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8xb4k_0aa98a33-f3d0-49ee-8909-a88e320aa26c/manager/0.log" Oct 07 15:19:21 crc kubenswrapper[4717]: I1007 15:19:21.976913 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-8zlpx_cfd500c0-e80e-4553-affd-c3b5437d67b7/kube-rbac-proxy/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.015556 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-8zlpx_cfd500c0-e80e-4553-affd-c3b5437d67b7/manager/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.046227 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-q8qkj_acb68432-c913-4c77-bfc8-ef15f9e74a1c/kube-rbac-proxy/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.207057 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-q8qkj_acb68432-c913-4c77-bfc8-ef15f9e74a1c/manager/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.247439 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-b4l9q_47df3c50-6937-4479-8463-4d6816e354d4/kube-rbac-proxy/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.515286 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-jdbmx_b4fabdda-208b-48e9-b4d9-85e638c74ad4/kube-rbac-proxy/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.519360 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-jdbmx_b4fabdda-208b-48e9-b4d9-85e638c74ad4/manager/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.525402 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-b4l9q_47df3c50-6937-4479-8463-4d6816e354d4/manager/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.792401 4717 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8xkj4_d7327559-8f4c-4745-acde-51a8ec9ca67a/manager/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.875021 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8xkj4_d7327559-8f4c-4745-acde-51a8ec9ca67a/kube-rbac-proxy/0.log" Oct 07 15:19:22 crc kubenswrapper[4717]: I1007 15:19:22.925666 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-7stm5_763bbe43-a00e-4a82-b90c-bd56ab6a516a/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.058350 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-7stm5_763bbe43-a00e-4a82-b90c-bd56ab6a516a/manager/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.105365 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-znbv7_d0d37d45-9608-4b1d-97b0-62f8b36ab834/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.169224 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-znbv7_d0d37d45-9608-4b1d-97b0-62f8b36ab834/manager/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.301939 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-f7qpd_1edc1259-6db8-4b89-86fa-6410a3d5931d/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.438964 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-f7qpd_1edc1259-6db8-4b89-86fa-6410a3d5931d/manager/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.530964 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-28ncp_dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.660059 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-28ncp_dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f/manager/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.765437 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vlk87_af994010-9d35-4fcd-b444-64acb6b65577/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.817932 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vlk87_af994010-9d35-4fcd-b444-64acb6b65577/manager/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.882187 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg_399188c6-e2ce-4c19-93ac-aec1a685d28c/kube-rbac-proxy/0.log" Oct 07 15:19:23 crc kubenswrapper[4717]: I1007 15:19:23.903776 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg_399188c6-e2ce-4c19-93ac-aec1a685d28c/manager/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 
15:19:24.058573 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-574b968964-27nb9_a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19/kube-rbac-proxy/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.146747 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5fc947cc4b-xh5qp_da883109-9f27-447d-aa39-aa7dcf80f19f/kube-rbac-proxy/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.416479 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-g6w94_a28c8b75-8908-4152-8e0c-2805e594a4b7/registry-server/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.456140 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5fc947cc4b-xh5qp_da883109-9f27-447d-aa39-aa7dcf80f19f/operator/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.624168 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-x5psc_9af7f241-0d0a-457e-9332-51d88b1a52d1/kube-rbac-proxy/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.814674 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-4dfwm_b2ac5ce4-5123-4d82-aca2-a776d4f89f09/kube-rbac-proxy/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.833337 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-x5psc_9af7f241-0d0a-457e-9332-51d88b1a52d1/manager/0.log" Oct 07 15:19:24 crc kubenswrapper[4717]: I1007 15:19:24.974099 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-4dfwm_b2ac5ce4-5123-4d82-aca2-a776d4f89f09/manager/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.115158 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-z65d2_6c49d800-49ef-4cb3-894a-632b519b22a8/operator/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.203119 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-slwnb_21e03a5b-f9f2-4e57-90d3-03edcbb7e2db/kube-rbac-proxy/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.380286 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-slwnb_21e03a5b-f9f2-4e57-90d3-03edcbb7e2db/manager/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.392443 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-bxcrn_c046dbff-c7b3-464b-97c7-ae47f24bcd61/kube-rbac-proxy/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.563763 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-bxcrn_c046dbff-c7b3-464b-97c7-ae47f24bcd61/manager/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.605610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-574b968964-27nb9_a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19/manager/0.log" Oct 07 15:19:25 crc 
kubenswrapper[4717]: I1007 15:19:25.690220 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-xsmnx_1b5c23ef-d224-468b-bc22-2d98de7c4132/kube-rbac-proxy/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.690698 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-xsmnx_1b5c23ef-d224-468b-bc22-2d98de7c4132/manager/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.836796 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gtf2z_9f385c04-ccd4-4526-ba50-53c7c637a0d3/kube-rbac-proxy/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.849373 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gtf2z_9f385c04-ccd4-4526-ba50-53c7c637a0d3/manager/0.log" Oct 07 15:19:25 crc kubenswrapper[4717]: I1007 15:19:25.869108 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:19:25 crc kubenswrapper[4717]: E1007 15:19:25.869504 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:39 crc kubenswrapper[4717]: I1007 15:19:39.868401 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:19:39 crc kubenswrapper[4717]: E1007 15:19:39.869207 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:43 crc kubenswrapper[4717]: I1007 15:19:43.929526 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8qs4m_46171099-e5d7-49a0-8e63-f8b9d3f8b0d8/control-plane-machine-set-operator/0.log" Oct 07 15:19:44 crc kubenswrapper[4717]: I1007 15:19:44.120747 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-79gpc_c3436d60-53c5-4994-88f1-d3aa555e96bd/kube-rbac-proxy/0.log" Oct 07 15:19:44 crc kubenswrapper[4717]: I1007 15:19:44.161608 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-79gpc_c3436d60-53c5-4994-88f1-d3aa555e96bd/machine-api-operator/0.log" Oct 07 15:19:53 crc kubenswrapper[4717]: I1007 15:19:53.869451 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:19:53 crc kubenswrapper[4717]: E1007 15:19:53.870782 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:19:55 crc kubenswrapper[4717]: I1007 15:19:55.687749 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cjl9f_2865143d-019f-4b4d-950a-c346856cfb7a/cert-manager-controller/0.log" Oct 07 15:19:55 crc kubenswrapper[4717]: I1007 15:19:55.858902 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kg4r6_3c320f2d-2ff6-4ca8-824a-c866f83684f5/cert-manager-cainjector/0.log" Oct 07 15:19:55 crc kubenswrapper[4717]: I1007 15:19:55.891649 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-z4t9q_a9fab481-f23d-4515-a94d-15fac56b0032/cert-manager-webhook/0.log" Oct 07 15:20:06 crc kubenswrapper[4717]: I1007 15:20:06.870336 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:20:06 crc kubenswrapper[4717]: E1007 15:20:06.871724 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:20:07 crc kubenswrapper[4717]: I1007 15:20:07.815952 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-x6gc5_af496732-9d8f-4872-b431-87ae5dc74691/nmstate-console-plugin/0.log" Oct 07 15:20:08 crc kubenswrapper[4717]: I1007 15:20:08.011610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5hlst_8083afc6-9711-4408-bd7c-d92138300930/nmstate-handler/0.log" Oct 07 15:20:08 crc kubenswrapper[4717]: I1007 15:20:08.026063 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rlbn4_df498fdc-f1f6-4fe0-8362-bb061a651a0a/kube-rbac-proxy/0.log" Oct 07 15:20:08 crc kubenswrapper[4717]: I1007 15:20:08.122453 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rlbn4_df498fdc-f1f6-4fe0-8362-bb061a651a0a/nmstate-metrics/0.log" Oct 07 15:20:08 crc kubenswrapper[4717]: I1007 15:20:08.258925 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pdn48_840561ca-9862-4a00-af4e-9e870d51efa4/nmstate-operator/0.log" Oct 07 15:20:08 crc kubenswrapper[4717]: I1007 15:20:08.351524 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-wlmfn_f70faa25-3fd3-4e7d-bf27-03a163483cb3/nmstate-webhook/0.log" Oct 07 15:20:20 crc kubenswrapper[4717]: I1007 15:20:20.868797 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:20:20 crc kubenswrapper[4717]: E1007 15:20:20.869542 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:20:21 crc kubenswrapper[4717]: I1007 15:20:21.791663 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rnp2p_9d7d987f-b766-481c-bcd5-ef6ec6e32956/kube-rbac-proxy/0.log" Oct 07 15:20:21 crc kubenswrapper[4717]: I1007 15:20:21.941470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rnp2p_9d7d987f-b766-481c-bcd5-ef6ec6e32956/controller/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.050609 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.294068 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.294692 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.317684 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.317689 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.527269 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.566839 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.595548 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.605982 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.789558 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.802375 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.806127 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/controller/0.log" Oct 07 15:20:22 crc kubenswrapper[4717]: I1007 15:20:22.820836 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.053772 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/kube-rbac-proxy-frr/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.086513 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/kube-rbac-proxy/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.116791 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/frr-metrics/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.393198 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/reloader/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.427994 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-t4gtn_c6ca3b03-14b6-45f2-828e-35d06c8455b9/frr-k8s-webhook-server/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.643962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b67b56d8d-7vj44_2b39e023-1e34-4d26-8001-ce161b5c0dbd/manager/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.847430 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-749b845cbd-g6bgz_0d8c8d26-60f0-4cf2-b139-fc06825e1ed4/webhook-server/0.log" Oct 07 15:20:23 crc kubenswrapper[4717]: I1007 15:20:23.949135 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wdrxn_89f792c4-3345-4538-8445-39f6ebfb784c/kube-rbac-proxy/0.log" Oct 07 15:20:24 crc kubenswrapper[4717]: I1007 15:20:24.596362 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wdrxn_89f792c4-3345-4538-8445-39f6ebfb784c/speaker/0.log" Oct 07 15:20:24 crc kubenswrapper[4717]: I1007 15:20:24.715270 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/frr/0.log" Oct 07 15:20:34 crc kubenswrapper[4717]: I1007 15:20:34.868052 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:20:34 crc kubenswrapper[4717]: E1007 15:20:34.870040 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:20:36 crc kubenswrapper[4717]: I1007 15:20:36.539109 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.099473 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.107769 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.239342 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.306388 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.308944 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/extract/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.325671 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.475542 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.675386 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.675461 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.689374 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.859821 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:20:37 crc kubenswrapper[4717]: I1007 15:20:37.890948 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.074077 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/registry-server/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.107402 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.283610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.284362 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.287339 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.898476 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:20:38 crc kubenswrapper[4717]: I1007 15:20:38.955923 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.137914 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.413661 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.418833 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.521150 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.684354 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.685111 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.747098 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/extract/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.750345 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/registry-server/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.874184 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nztkh_56aec528-e156-45c5-ac1a-d55cc129894c/marketplace-operator/0.log" Oct 07 15:20:39 crc kubenswrapper[4717]: I1007 15:20:39.921743 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.090606 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.110153 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.115194 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.269079 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.279805 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.316595 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.519026 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.520305 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.557691 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/registry-server/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.602622 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.686470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:20:40 crc kubenswrapper[4717]: I1007 15:20:40.703429 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:20:41 crc kubenswrapper[4717]: I1007 15:20:41.128788 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/registry-server/0.log" Oct 07 15:20:49 crc kubenswrapper[4717]: I1007 15:20:49.868664 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:20:49 crc kubenswrapper[4717]: E1007 15:20:49.869461 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.009104 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:20:51 crc kubenswrapper[4717]: E1007 15:20:51.009792 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6738049f-c98a-41ab-bbd3-43633b6b86bc" containerName="container-00" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.009805 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="6738049f-c98a-41ab-bbd3-43633b6b86bc" containerName="container-00" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.010054 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="6738049f-c98a-41ab-bbd3-43633b6b86bc" containerName="container-00" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.011441 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.029410 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.174483 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rc25\" (UniqueName: \"kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.174563 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.174625 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.276340 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.276554 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rc25\" (UniqueName: \"kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.276609 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities\") pod 
\"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.277242 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.277536 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.303141 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rc25\" (UniqueName: \"kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25\") pod \"redhat-marketplace-cndpq\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.339622 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.802112 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:20:51 crc kubenswrapper[4717]: I1007 15:20:51.965988 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerStarted","Data":"f20cce4381d3dcef35584d9311912ba48caabac6e600bc5697c8d9994403d32d"} Oct 07 15:20:52 crc kubenswrapper[4717]: I1007 15:20:52.976316 4717 generic.go:334] "Generic (PLEG): container finished" podID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerID="f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c" exitCode=0 Oct 07 15:20:52 crc kubenswrapper[4717]: I1007 15:20:52.976364 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerDied","Data":"f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c"} Oct 07 15:20:52 crc kubenswrapper[4717]: I1007 15:20:52.979236 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:20:54 crc kubenswrapper[4717]: I1007 15:20:54.997241 4717 generic.go:334] "Generic (PLEG): container finished" podID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerID="e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5" exitCode=0 Oct 07 15:20:54 crc kubenswrapper[4717]: I1007 15:20:54.997361 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerDied","Data":"e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5"} Oct 07 15:20:56 crc kubenswrapper[4717]: I1007 15:20:56.019112 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" 
event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerStarted","Data":"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5"} Oct 07 15:20:56 crc kubenswrapper[4717]: I1007 15:20:56.042996 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cndpq" podStartSLOduration=3.43477465 podStartE2EDuration="6.042977052s" podCreationTimestamp="2025-10-07 15:20:50 +0000 UTC" firstStartedPulling="2025-10-07 15:20:52.978938885 +0000 UTC m=+5234.806864677" lastFinishedPulling="2025-10-07 15:20:55.587141287 +0000 UTC m=+5237.415067079" observedRunningTime="2025-10-07 15:20:56.036674158 +0000 UTC m=+5237.864599960" watchObservedRunningTime="2025-10-07 15:20:56.042977052 +0000 UTC m=+5237.870902844" Oct 07 15:21:01 crc kubenswrapper[4717]: I1007 15:21:01.353964 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:01 crc kubenswrapper[4717]: I1007 15:21:01.354473 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:01 crc kubenswrapper[4717]: I1007 15:21:01.414739 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:02 crc kubenswrapper[4717]: I1007 15:21:02.126930 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:02 crc kubenswrapper[4717]: I1007 15:21:02.187671 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:21:02 crc kubenswrapper[4717]: I1007 15:21:02.870966 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:21:02 crc kubenswrapper[4717]: E1007 15:21:02.871564 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.093232 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cndpq" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="registry-server" containerID="cri-o://3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5" gracePeriod=2 Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.670999 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.795236 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rc25\" (UniqueName: \"kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25\") pod \"13c94290-0106-4ae9-b02a-685cc0aa3650\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.795732 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities\") pod \"13c94290-0106-4ae9-b02a-685cc0aa3650\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.795798 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content\") pod \"13c94290-0106-4ae9-b02a-685cc0aa3650\" (UID: \"13c94290-0106-4ae9-b02a-685cc0aa3650\") " Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.797431 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities" (OuterVolumeSpecName: "utilities") pod "13c94290-0106-4ae9-b02a-685cc0aa3650" (UID: "13c94290-0106-4ae9-b02a-685cc0aa3650"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.811493 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.812474 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25" (OuterVolumeSpecName: "kube-api-access-2rc25") pod "13c94290-0106-4ae9-b02a-685cc0aa3650" (UID: "13c94290-0106-4ae9-b02a-685cc0aa3650"). InnerVolumeSpecName "kube-api-access-2rc25". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.815829 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13c94290-0106-4ae9-b02a-685cc0aa3650" (UID: "13c94290-0106-4ae9-b02a-685cc0aa3650"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.913824 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rc25\" (UniqueName: \"kubernetes.io/projected/13c94290-0106-4ae9-b02a-685cc0aa3650-kube-api-access-2rc25\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:04 crc kubenswrapper[4717]: I1007 15:21:04.913869 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13c94290-0106-4ae9-b02a-685cc0aa3650-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.103452 4717 generic.go:334] "Generic (PLEG): container finished" podID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerID="3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5" exitCode=0 Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.103589 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerDied","Data":"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5"} Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.104659 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cndpq" event={"ID":"13c94290-0106-4ae9-b02a-685cc0aa3650","Type":"ContainerDied","Data":"f20cce4381d3dcef35584d9311912ba48caabac6e600bc5697c8d9994403d32d"} Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.104734 4717 scope.go:117] "RemoveContainer" containerID="3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.103673 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cndpq" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.133695 4717 scope.go:117] "RemoveContainer" containerID="e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.143363 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.154806 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cndpq"] Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.167785 4717 scope.go:117] "RemoveContainer" containerID="f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.231573 4717 scope.go:117] "RemoveContainer" containerID="3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5" Oct 07 15:21:05 crc kubenswrapper[4717]: E1007 15:21:05.232272 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5\": container with ID starting with 3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5 not found: ID does not exist" containerID="3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.232363 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5"} err="failed to get container status \"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5\": rpc error: code = NotFound desc = could not find container \"3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5\": container with ID starting with 3338879e50e109e07920eddf8a644224fbc2862fc680e7d007c825ca3f838ef5 not found: ID does not exist" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.232452 4717 scope.go:117] "RemoveContainer" containerID="e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5" Oct 07 15:21:05 crc kubenswrapper[4717]: E1007 15:21:05.232890 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5\": container with ID starting with e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5 not found: ID does not exist" containerID="e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.232977 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5"} err="failed to get container status \"e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5\": rpc error: code = NotFound desc = could not find container \"e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5\": container with ID starting with e06556e78ae89c75004b32b422e1452bd212da6e3b976c6016295c8c695ddad5 not found: ID does not exist" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.233070 4717 scope.go:117] "RemoveContainer" containerID="f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c" Oct 07 15:21:05 crc kubenswrapper[4717]: E1007 15:21:05.233434 4717 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c\": container with ID starting with f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c not found: ID does not exist" containerID="f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c" Oct 07 15:21:05 crc kubenswrapper[4717]: I1007 15:21:05.233513 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c"} err="failed to get container status \"f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c\": rpc error: code = NotFound desc = could not find container \"f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c\": container with ID starting with f0d845c704379c08428b77db2465fa59206bc15e59920fc993f4e0e1f78ba97c not found: ID does not exist" Oct 07 15:21:06 crc kubenswrapper[4717]: I1007 15:21:06.882141 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" path="/var/lib/kubelet/pods/13c94290-0106-4ae9-b02a-685cc0aa3650/volumes" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.533145 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:10 crc kubenswrapper[4717]: E1007 15:21:10.534098 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="registry-server" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.534111 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="registry-server" Oct 07 15:21:10 crc kubenswrapper[4717]: E1007 15:21:10.534134 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="extract-content" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.534141 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="extract-content" Oct 07 15:21:10 crc kubenswrapper[4717]: E1007 15:21:10.534168 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="extract-utilities" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.534174 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="extract-utilities" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.534349 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c94290-0106-4ae9-b02a-685cc0aa3650" containerName="registry-server" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.535738 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.553695 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.641258 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.641371 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6nq\" (UniqueName: \"kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.641519 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.743161 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6nq\" (UniqueName: \"kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.743300 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.743362 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.743845 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.744330 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.768901 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4r6nq\" (UniqueName: \"kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq\") pod \"certified-operators-vmcsl\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:10 crc kubenswrapper[4717]: I1007 15:21:10.861526 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:11 crc kubenswrapper[4717]: I1007 15:21:11.609664 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:12 crc kubenswrapper[4717]: I1007 15:21:12.172359 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerID="99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c" exitCode=0 Oct 07 15:21:12 crc kubenswrapper[4717]: I1007 15:21:12.172430 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerDied","Data":"99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c"} Oct 07 15:21:12 crc kubenswrapper[4717]: I1007 15:21:12.172902 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerStarted","Data":"58d5219452479fca83fea13822707eb79f5a5f01f34e0535881927085a3da1a3"} Oct 07 15:21:13 crc kubenswrapper[4717]: I1007 15:21:13.182509 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerStarted","Data":"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf"} Oct 07 15:21:14 crc kubenswrapper[4717]: I1007 15:21:14.195560 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerID="33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf" exitCode=0 Oct 07 15:21:14 crc kubenswrapper[4717]: I1007 15:21:14.195639 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerDied","Data":"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf"} Oct 07 15:21:14 crc kubenswrapper[4717]: I1007 15:21:14.870387 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:21:14 crc kubenswrapper[4717]: E1007 15:21:14.872333 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:21:16 crc kubenswrapper[4717]: I1007 15:21:16.225369 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerStarted","Data":"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0"} Oct 07 15:21:16 crc kubenswrapper[4717]: I1007 15:21:16.253903 4717 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vmcsl" podStartSLOduration=3.457179863 podStartE2EDuration="6.253883194s" podCreationTimestamp="2025-10-07 15:21:10 +0000 UTC" firstStartedPulling="2025-10-07 15:21:12.174229992 +0000 UTC m=+5254.002155784" lastFinishedPulling="2025-10-07 15:21:14.970933323 +0000 UTC m=+5256.798859115" observedRunningTime="2025-10-07 15:21:16.247839248 +0000 UTC m=+5258.075765040" watchObservedRunningTime="2025-10-07 15:21:16.253883194 +0000 UTC m=+5258.081808986" Oct 07 15:21:20 crc kubenswrapper[4717]: I1007 15:21:20.861797 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:20 crc kubenswrapper[4717]: I1007 15:21:20.862967 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:20 crc kubenswrapper[4717]: I1007 15:21:20.919729 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:21 crc kubenswrapper[4717]: I1007 15:21:21.316233 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:22 crc kubenswrapper[4717]: I1007 15:21:22.166607 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.287063 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vmcsl" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="registry-server" containerID="cri-o://785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0" gracePeriod=2 Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.851169 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.924861 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r6nq\" (UniqueName: \"kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq\") pod \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.924992 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities\") pod \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.925156 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") pod \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.926208 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities" (OuterVolumeSpecName: "utilities") pod "48a770bc-eb6a-45c5-890a-2b7818c0d2dc" (UID: "48a770bc-eb6a-45c5-890a-2b7818c0d2dc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:21:23 crc kubenswrapper[4717]: I1007 15:21:23.936338 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq" (OuterVolumeSpecName: "kube-api-access-4r6nq") pod "48a770bc-eb6a-45c5-890a-2b7818c0d2dc" (UID: "48a770bc-eb6a-45c5-890a-2b7818c0d2dc"). InnerVolumeSpecName "kube-api-access-4r6nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.026723 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r6nq\" (UniqueName: \"kubernetes.io/projected/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-kube-api-access-4r6nq\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.026756 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.302490 4717 generic.go:334] "Generic (PLEG): container finished" podID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerID="785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0" exitCode=0 Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.302588 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerDied","Data":"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0"} Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.302944 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmcsl" event={"ID":"48a770bc-eb6a-45c5-890a-2b7818c0d2dc","Type":"ContainerDied","Data":"58d5219452479fca83fea13822707eb79f5a5f01f34e0535881927085a3da1a3"} Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.302979 4717 scope.go:117] "RemoveContainer" containerID="785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.302602 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmcsl" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.325979 4717 scope.go:117] "RemoveContainer" containerID="33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.333855 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48a770bc-eb6a-45c5-890a-2b7818c0d2dc" (UID: "48a770bc-eb6a-45c5-890a-2b7818c0d2dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.339064 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") pod \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\" (UID: \"48a770bc-eb6a-45c5-890a-2b7818c0d2dc\") " Oct 07 15:21:24 crc kubenswrapper[4717]: W1007 15:21:24.340057 4717 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/48a770bc-eb6a-45c5-890a-2b7818c0d2dc/volumes/kubernetes.io~empty-dir/catalog-content Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.340079 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48a770bc-eb6a-45c5-890a-2b7818c0d2dc" (UID: "48a770bc-eb6a-45c5-890a-2b7818c0d2dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.350222 4717 scope.go:117] "RemoveContainer" containerID="99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.420989 4717 scope.go:117] "RemoveContainer" containerID="785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0" Oct 07 15:21:24 crc kubenswrapper[4717]: E1007 15:21:24.421605 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0\": container with ID starting with 785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0 not found: ID does not exist" containerID="785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.421668 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0"} err="failed to get container status \"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0\": rpc error: code = NotFound desc = could not find container \"785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0\": container with ID starting with 785b267d990c88fa68e05da97b739ded21c36955d6e85b55ab6ebec6551b49a0 not found: ID does not exist" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.421705 4717 scope.go:117] "RemoveContainer" containerID="33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf" Oct 07 15:21:24 crc kubenswrapper[4717]: E1007 15:21:24.422300 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf\": container with ID starting with 33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf not found: ID does not exist" containerID="33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.422352 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf"} err="failed to get container status \"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf\": rpc error: code = NotFound desc = could not find 
container \"33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf\": container with ID starting with 33c11fda9878c0f9d0d58e72ac540f02a07d8ac8ef2ed8db48c460fb660867bf not found: ID does not exist" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.422387 4717 scope.go:117] "RemoveContainer" containerID="99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c" Oct 07 15:21:24 crc kubenswrapper[4717]: E1007 15:21:24.423105 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c\": container with ID starting with 99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c not found: ID does not exist" containerID="99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.423154 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c"} err="failed to get container status \"99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c\": rpc error: code = NotFound desc = could not find container \"99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c\": container with ID starting with 99dc722a2c6e0af4abb1ae3721f4a349bee426b13eebb18f69d2432b55dea56c not found: ID does not exist" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.441873 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48a770bc-eb6a-45c5-890a-2b7818c0d2dc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.652557 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.663079 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vmcsl"] Oct 07 15:21:24 crc kubenswrapper[4717]: I1007 15:21:24.883050 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" path="/var/lib/kubelet/pods/48a770bc-eb6a-45c5-890a-2b7818c0d2dc/volumes" Oct 07 15:21:28 crc kubenswrapper[4717]: I1007 15:21:28.876843 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:21:28 crc kubenswrapper[4717]: E1007 15:21:28.878365 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.433598 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:21:43 crc kubenswrapper[4717]: E1007 15:21:43.434689 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="extract-utilities" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.434704 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="extract-utilities" Oct 
07 15:21:43 crc kubenswrapper[4717]: E1007 15:21:43.434722 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="registry-server" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.434730 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="registry-server" Oct 07 15:21:43 crc kubenswrapper[4717]: E1007 15:21:43.434750 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="extract-content" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.434757 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="extract-content" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.435023 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="48a770bc-eb6a-45c5-890a-2b7818c0d2dc" containerName="registry-server" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.437222 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.459043 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.483257 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5p7\" (UniqueName: \"kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.483350 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.483427 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.585582 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.585708 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.586173 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.586214 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.586386 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5p7\" (UniqueName: \"kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.605794 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5p7\" (UniqueName: \"kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7\") pod \"redhat-operators-8gh87\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.763262 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:21:43 crc kubenswrapper[4717]: I1007 15:21:43.869133 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:21:43 crc kubenswrapper[4717]: E1007 15:21:43.869424 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:21:44 crc kubenswrapper[4717]: I1007 15:21:44.375565 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:21:44 crc kubenswrapper[4717]: I1007 15:21:44.522409 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerStarted","Data":"481f80bab08ece056fa4701561f616dae631d9745a1010543b92d9d44a87dbb5"} Oct 07 15:21:45 crc kubenswrapper[4717]: I1007 15:21:45.533372 4717 generic.go:334] "Generic (PLEG): container finished" podID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerID="585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1" exitCode=0 Oct 07 15:21:45 crc kubenswrapper[4717]: I1007 15:21:45.533724 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerDied","Data":"585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1"} Oct 07 15:21:47 crc kubenswrapper[4717]: I1007 15:21:47.560761 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" 
event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerStarted","Data":"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f"} Oct 07 15:21:51 crc kubenswrapper[4717]: I1007 15:21:51.632023 4717 generic.go:334] "Generic (PLEG): container finished" podID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerID="46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f" exitCode=0 Oct 07 15:21:51 crc kubenswrapper[4717]: I1007 15:21:51.632571 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerDied","Data":"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f"} Oct 07 15:21:55 crc kubenswrapper[4717]: I1007 15:21:55.674300 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerStarted","Data":"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa"} Oct 07 15:21:55 crc kubenswrapper[4717]: I1007 15:21:55.697164 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8gh87" podStartSLOduration=4.90812627 podStartE2EDuration="12.697143339s" podCreationTimestamp="2025-10-07 15:21:43 +0000 UTC" firstStartedPulling="2025-10-07 15:21:45.536646531 +0000 UTC m=+5287.364572323" lastFinishedPulling="2025-10-07 15:21:53.3256636 +0000 UTC m=+5295.153589392" observedRunningTime="2025-10-07 15:21:55.691285418 +0000 UTC m=+5297.519211210" watchObservedRunningTime="2025-10-07 15:21:55.697143339 +0000 UTC m=+5297.525069121" Oct 07 15:21:56 crc kubenswrapper[4717]: I1007 15:21:56.869217 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:21:56 crc kubenswrapper[4717]: E1007 15:21:56.869825 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:22:03 crc kubenswrapper[4717]: I1007 15:22:03.763814 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:03 crc kubenswrapper[4717]: I1007 15:22:03.764387 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:03 crc kubenswrapper[4717]: I1007 15:22:03.807367 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:04 crc kubenswrapper[4717]: I1007 15:22:04.811454 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:04 crc kubenswrapper[4717]: I1007 15:22:04.862382 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:22:06 crc kubenswrapper[4717]: I1007 15:22:06.785454 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8gh87" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="registry-server" 
containerID="cri-o://0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa" gracePeriod=2 Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.380232 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.471118 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities\") pod \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.471198 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content\") pod \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.471251 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w5p7\" (UniqueName: \"kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7\") pod \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\" (UID: \"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1\") " Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.473023 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities" (OuterVolumeSpecName: "utilities") pod "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" (UID: "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.473656 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.476658 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7" (OuterVolumeSpecName: "kube-api-access-7w5p7") pod "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" (UID: "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1"). InnerVolumeSpecName "kube-api-access-7w5p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.563989 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" (UID: "1ca08193-5f5b-45c5-a9e0-3343ccb4ede1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.576061 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.576113 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w5p7\" (UniqueName: \"kubernetes.io/projected/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1-kube-api-access-7w5p7\") on node \"crc\" DevicePath \"\"" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.801817 4717 generic.go:334] "Generic (PLEG): container finished" podID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerID="0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa" exitCode=0 Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.801873 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8gh87" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.801875 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerDied","Data":"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa"} Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.801950 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8gh87" event={"ID":"1ca08193-5f5b-45c5-a9e0-3343ccb4ede1","Type":"ContainerDied","Data":"481f80bab08ece056fa4701561f616dae631d9745a1010543b92d9d44a87dbb5"} Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.801967 4717 scope.go:117] "RemoveContainer" containerID="0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.850362 4717 scope.go:117] "RemoveContainer" containerID="46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.853638 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.865562 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8gh87"] Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.874733 4717 scope.go:117] "RemoveContainer" containerID="585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.917614 4717 scope.go:117] "RemoveContainer" containerID="0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa" Oct 07 15:22:07 crc kubenswrapper[4717]: E1007 15:22:07.918106 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa\": container with ID starting with 0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa not found: ID does not exist" containerID="0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.918147 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa"} err="failed to get container status \"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa\": 
rpc error: code = NotFound desc = could not find container \"0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa\": container with ID starting with 0795f3433f899d8fd0da567fa5557c304f4252c1557023aada9b87541696acaa not found: ID does not exist" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.918176 4717 scope.go:117] "RemoveContainer" containerID="46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f" Oct 07 15:22:07 crc kubenswrapper[4717]: E1007 15:22:07.918507 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f\": container with ID starting with 46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f not found: ID does not exist" containerID="46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.918538 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f"} err="failed to get container status \"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f\": rpc error: code = NotFound desc = could not find container \"46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f\": container with ID starting with 46f5ad21fc9e7637c94aa78d94fad62681ff845d50b5cf175032b8adcf0b9b9f not found: ID does not exist" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.918563 4717 scope.go:117] "RemoveContainer" containerID="585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1" Oct 07 15:22:07 crc kubenswrapper[4717]: E1007 15:22:07.918802 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1\": container with ID starting with 585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1 not found: ID does not exist" containerID="585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1" Oct 07 15:22:07 crc kubenswrapper[4717]: I1007 15:22:07.918831 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1"} err="failed to get container status \"585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1\": rpc error: code = NotFound desc = could not find container \"585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1\": container with ID starting with 585b4e45535f743cfe1b23897d76c7f2ce7f3af5a6bc0016f072a697a48f2aa1 not found: ID does not exist" Oct 07 15:22:08 crc kubenswrapper[4717]: I1007 15:22:08.876211 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:22:08 crc kubenswrapper[4717]: I1007 15:22:08.894847 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" path="/var/lib/kubelet/pods/1ca08193-5f5b-45c5-a9e0-3343ccb4ede1/volumes" Oct 07 15:22:09 crc kubenswrapper[4717]: I1007 15:22:09.825746 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66"} Oct 07 15:22:47 crc kubenswrapper[4717]: I1007 
15:22:47.008923 4717 scope.go:117] "RemoveContainer" containerID="9a45528d3ae9b5371191780f5ba5358779eca6ca1b0714ff4c74c4962e22e495" Oct 07 15:23:20 crc kubenswrapper[4717]: I1007 15:23:20.506748 4717 generic.go:334] "Generic (PLEG): container finished" podID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerID="9ed70b47cd1d10f73babbd4a0bdc5cb59d1328ef1d0fdee8da8e5326dda5a594" exitCode=0 Oct 07 15:23:20 crc kubenswrapper[4717]: I1007 15:23:20.506858 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rbch/must-gather-n7ldt" event={"ID":"d7e9085d-a33d-4850-9c89-0f29ccddf977","Type":"ContainerDied","Data":"9ed70b47cd1d10f73babbd4a0bdc5cb59d1328ef1d0fdee8da8e5326dda5a594"} Oct 07 15:23:20 crc kubenswrapper[4717]: I1007 15:23:20.507952 4717 scope.go:117] "RemoveContainer" containerID="9ed70b47cd1d10f73babbd4a0bdc5cb59d1328ef1d0fdee8da8e5326dda5a594" Oct 07 15:23:20 crc kubenswrapper[4717]: I1007 15:23:20.944646 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8rbch_must-gather-n7ldt_d7e9085d-a33d-4850-9c89-0f29ccddf977/gather/0.log" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.123787 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8rbch/must-gather-n7ldt"] Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.124609 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-8rbch/must-gather-n7ldt" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="copy" containerID="cri-o://af38d3008dbfed0d2d7ad43f2727a0aab6bb34072ea0489482833035092bdb25" gracePeriod=2 Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.132751 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8rbch/must-gather-n7ldt"] Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.604060 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8rbch_must-gather-n7ldt_d7e9085d-a33d-4850-9c89-0f29ccddf977/copy/0.log" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.605249 4717 generic.go:334] "Generic (PLEG): container finished" podID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerID="af38d3008dbfed0d2d7ad43f2727a0aab6bb34072ea0489482833035092bdb25" exitCode=143 Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.605308 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="106e678eafad10c80f158d128906f3005d87758964f7bc24d4f40f102bfcd9f2" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.611150 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8rbch_must-gather-n7ldt_d7e9085d-a33d-4850-9c89-0f29ccddf977/copy/0.log" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.611809 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.785396 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjlqp\" (UniqueName: \"kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp\") pod \"d7e9085d-a33d-4850-9c89-0f29ccddf977\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.785900 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output\") pod \"d7e9085d-a33d-4850-9c89-0f29ccddf977\" (UID: \"d7e9085d-a33d-4850-9c89-0f29ccddf977\") " Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.791553 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp" (OuterVolumeSpecName: "kube-api-access-mjlqp") pod "d7e9085d-a33d-4850-9c89-0f29ccddf977" (UID: "d7e9085d-a33d-4850-9c89-0f29ccddf977"). InnerVolumeSpecName "kube-api-access-mjlqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.888791 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjlqp\" (UniqueName: \"kubernetes.io/projected/d7e9085d-a33d-4850-9c89-0f29ccddf977-kube-api-access-mjlqp\") on node \"crc\" DevicePath \"\"" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.986522 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d7e9085d-a33d-4850-9c89-0f29ccddf977" (UID: "d7e9085d-a33d-4850-9c89-0f29ccddf977"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:23:30 crc kubenswrapper[4717]: I1007 15:23:30.990578 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d7e9085d-a33d-4850-9c89-0f29ccddf977-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 15:23:31 crc kubenswrapper[4717]: I1007 15:23:31.613525 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rbch/must-gather-n7ldt" Oct 07 15:23:32 crc kubenswrapper[4717]: I1007 15:23:32.880465 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" path="/var/lib/kubelet/pods/d7e9085d-a33d-4850-9c89-0f29ccddf977/volumes" Oct 07 15:23:47 crc kubenswrapper[4717]: I1007 15:23:47.101852 4717 scope.go:117] "RemoveContainer" containerID="9ed70b47cd1d10f73babbd4a0bdc5cb59d1328ef1d0fdee8da8e5326dda5a594" Oct 07 15:23:47 crc kubenswrapper[4717]: I1007 15:23:47.188168 4717 scope.go:117] "RemoveContainer" containerID="af38d3008dbfed0d2d7ad43f2727a0aab6bb34072ea0489482833035092bdb25" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.756739 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:23:55 crc kubenswrapper[4717]: E1007 15:23:55.757778 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="extract-content" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.757797 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="extract-content" Oct 07 15:23:55 crc kubenswrapper[4717]: E1007 15:23:55.757816 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="registry-server" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.757826 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="registry-server" Oct 07 15:23:55 crc kubenswrapper[4717]: E1007 15:23:55.757885 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="copy" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.757894 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="copy" Oct 07 15:23:55 crc kubenswrapper[4717]: E1007 15:23:55.757922 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="gather" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.757930 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="gather" Oct 07 15:23:55 crc kubenswrapper[4717]: E1007 15:23:55.757958 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="extract-utilities" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.757966 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="extract-utilities" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.758534 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="copy" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.758555 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca08193-5f5b-45c5-a9e0-3343ccb4ede1" containerName="registry-server" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.758572 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e9085d-a33d-4850-9c89-0f29ccddf977" containerName="gather" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.760456 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.781830 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.803905 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.803986 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.804064 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmbr\" (UniqueName: \"kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.906425 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.906945 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.907327 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.907377 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmbr\" (UniqueName: \"kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.907687 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:55 crc kubenswrapper[4717]: I1007 15:23:55.933070 4717 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cqmbr\" (UniqueName: \"kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr\") pod \"community-operators-vznr9\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:56 crc kubenswrapper[4717]: I1007 15:23:56.082470 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:23:56 crc kubenswrapper[4717]: I1007 15:23:56.609476 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:23:56 crc kubenswrapper[4717]: W1007 15:23:56.615276 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8638656_9d65_4c13_94b7_e18966924d82.slice/crio-2f3cd4889f439b33938ad105ded68fd2e2cda12dac4e2a2768d3b5772ff79b8c WatchSource:0}: Error finding container 2f3cd4889f439b33938ad105ded68fd2e2cda12dac4e2a2768d3b5772ff79b8c: Status 404 returned error can't find the container with id 2f3cd4889f439b33938ad105ded68fd2e2cda12dac4e2a2768d3b5772ff79b8c Oct 07 15:23:56 crc kubenswrapper[4717]: I1007 15:23:56.846901 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8638656-9d65-4c13-94b7-e18966924d82" containerID="c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02" exitCode=0 Oct 07 15:23:56 crc kubenswrapper[4717]: I1007 15:23:56.846984 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerDied","Data":"c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02"} Oct 07 15:23:56 crc kubenswrapper[4717]: I1007 15:23:56.847045 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerStarted","Data":"2f3cd4889f439b33938ad105ded68fd2e2cda12dac4e2a2768d3b5772ff79b8c"} Oct 07 15:23:58 crc kubenswrapper[4717]: I1007 15:23:58.892150 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8638656-9d65-4c13-94b7-e18966924d82" containerID="984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313" exitCode=0 Oct 07 15:23:58 crc kubenswrapper[4717]: I1007 15:23:58.892687 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerDied","Data":"984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313"} Oct 07 15:23:59 crc kubenswrapper[4717]: I1007 15:23:59.904140 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerStarted","Data":"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5"} Oct 07 15:23:59 crc kubenswrapper[4717]: I1007 15:23:59.930091 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vznr9" podStartSLOduration=2.479660998 podStartE2EDuration="4.930067885s" podCreationTimestamp="2025-10-07 15:23:55 +0000 UTC" firstStartedPulling="2025-10-07 15:23:56.848780444 +0000 UTC m=+5418.676706226" lastFinishedPulling="2025-10-07 15:23:59.299187321 +0000 UTC m=+5421.127113113" observedRunningTime="2025-10-07 15:23:59.920584144 +0000 UTC 
m=+5421.748509956" watchObservedRunningTime="2025-10-07 15:23:59.930067885 +0000 UTC m=+5421.757993677" Oct 07 15:24:06 crc kubenswrapper[4717]: I1007 15:24:06.082973 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:06 crc kubenswrapper[4717]: I1007 15:24:06.083596 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:06 crc kubenswrapper[4717]: I1007 15:24:06.129488 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:07 crc kubenswrapper[4717]: I1007 15:24:07.015327 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:07 crc kubenswrapper[4717]: I1007 15:24:07.128550 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:24:08 crc kubenswrapper[4717]: I1007 15:24:08.981075 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vznr9" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="registry-server" containerID="cri-o://bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5" gracePeriod=2 Oct 07 15:24:09 crc kubenswrapper[4717]: I1007 15:24:09.964608 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.004459 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqmbr\" (UniqueName: \"kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr\") pod \"f8638656-9d65-4c13-94b7-e18966924d82\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.004888 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content\") pod \"f8638656-9d65-4c13-94b7-e18966924d82\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.005140 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities\") pod \"f8638656-9d65-4c13-94b7-e18966924d82\" (UID: \"f8638656-9d65-4c13-94b7-e18966924d82\") " Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.006863 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities" (OuterVolumeSpecName: "utilities") pod "f8638656-9d65-4c13-94b7-e18966924d82" (UID: "f8638656-9d65-4c13-94b7-e18966924d82"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.026426 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr" (OuterVolumeSpecName: "kube-api-access-cqmbr") pod "f8638656-9d65-4c13-94b7-e18966924d82" (UID: "f8638656-9d65-4c13-94b7-e18966924d82"). InnerVolumeSpecName "kube-api-access-cqmbr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.028065 4717 generic.go:334] "Generic (PLEG): container finished" podID="f8638656-9d65-4c13-94b7-e18966924d82" containerID="bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5" exitCode=0 Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.028111 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerDied","Data":"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5"} Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.028139 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vznr9" event={"ID":"f8638656-9d65-4c13-94b7-e18966924d82","Type":"ContainerDied","Data":"2f3cd4889f439b33938ad105ded68fd2e2cda12dac4e2a2768d3b5772ff79b8c"} Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.028156 4717 scope.go:117] "RemoveContainer" containerID="bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.028322 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vznr9" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.065417 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8638656-9d65-4c13-94b7-e18966924d82" (UID: "f8638656-9d65-4c13-94b7-e18966924d82"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.084924 4717 scope.go:117] "RemoveContainer" containerID="984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.107571 4717 scope.go:117] "RemoveContainer" containerID="c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.107787 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqmbr\" (UniqueName: \"kubernetes.io/projected/f8638656-9d65-4c13-94b7-e18966924d82-kube-api-access-cqmbr\") on node \"crc\" DevicePath \"\"" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.107817 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.107828 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8638656-9d65-4c13-94b7-e18966924d82-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.151852 4717 scope.go:117] "RemoveContainer" containerID="bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5" Oct 07 15:24:10 crc kubenswrapper[4717]: E1007 15:24:10.152437 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5\": container with ID starting with bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5 not found: ID does not exist" 
containerID="bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.152498 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5"} err="failed to get container status \"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5\": rpc error: code = NotFound desc = could not find container \"bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5\": container with ID starting with bc6a80c8716d993b1668011449954702e2157c87b1bd165e141883c8ab838cc5 not found: ID does not exist" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.152528 4717 scope.go:117] "RemoveContainer" containerID="984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313" Oct 07 15:24:10 crc kubenswrapper[4717]: E1007 15:24:10.153199 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313\": container with ID starting with 984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313 not found: ID does not exist" containerID="984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.153237 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313"} err="failed to get container status \"984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313\": rpc error: code = NotFound desc = could not find container \"984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313\": container with ID starting with 984ebea973b0446b945eaa44cc8d783db5744c3dc3fb1b09ba74ece9b6782313 not found: ID does not exist" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.153259 4717 scope.go:117] "RemoveContainer" containerID="c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02" Oct 07 15:24:10 crc kubenswrapper[4717]: E1007 15:24:10.153510 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02\": container with ID starting with c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02 not found: ID does not exist" containerID="c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.153536 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02"} err="failed to get container status \"c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02\": rpc error: code = NotFound desc = could not find container \"c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02\": container with ID starting with c0c901ae50442c017fea32715ad5ff7662f0187939a4814e9f0c84e55acfaa02 not found: ID does not exist" Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.375849 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.387829 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vznr9"] Oct 07 15:24:10 crc kubenswrapper[4717]: I1007 15:24:10.885707 
4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8638656-9d65-4c13-94b7-e18966924d82" path="/var/lib/kubelet/pods/f8638656-9d65-4c13-94b7-e18966924d82/volumes" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.362982 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4dql/must-gather-qlcqc"] Oct 07 15:24:17 crc kubenswrapper[4717]: E1007 15:24:17.363945 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="extract-content" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.363962 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="extract-content" Oct 07 15:24:17 crc kubenswrapper[4717]: E1007 15:24:17.363999 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="registry-server" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.364019 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="registry-server" Oct 07 15:24:17 crc kubenswrapper[4717]: E1007 15:24:17.364030 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="extract-utilities" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.364037 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="extract-utilities" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.364236 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8638656-9d65-4c13-94b7-e18966924d82" containerName="registry-server" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.365252 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.372305 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z4dql"/"openshift-service-ca.crt" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.373721 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-z4dql"/"kube-root-ca.crt" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.382190 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z4dql/must-gather-qlcqc"] Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.557353 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxg78\" (UniqueName: \"kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.557531 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.659426 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.659539 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxg78\" (UniqueName: \"kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.660154 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.690226 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxg78\" (UniqueName: \"kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78\") pod \"must-gather-qlcqc\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:17 crc kubenswrapper[4717]: I1007 15:24:17.692944 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:24:18 crc kubenswrapper[4717]: I1007 15:24:18.194598 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-z4dql/must-gather-qlcqc"] Oct 07 15:24:19 crc kubenswrapper[4717]: I1007 15:24:19.121517 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/must-gather-qlcqc" event={"ID":"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4","Type":"ContainerStarted","Data":"a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e"} Oct 07 15:24:19 crc kubenswrapper[4717]: I1007 15:24:19.121907 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/must-gather-qlcqc" event={"ID":"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4","Type":"ContainerStarted","Data":"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96"} Oct 07 15:24:19 crc kubenswrapper[4717]: I1007 15:24:19.121924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/must-gather-qlcqc" event={"ID":"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4","Type":"ContainerStarted","Data":"cfad8e7cda7f7f00ae9c4928f4a8c2a9752f9f6f40068c78e2d8127f7be9422b"} Oct 07 15:24:19 crc kubenswrapper[4717]: I1007 15:24:19.143201 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z4dql/must-gather-qlcqc" podStartSLOduration=2.143177282 podStartE2EDuration="2.143177282s" podCreationTimestamp="2025-10-07 15:24:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:24:19.138978637 +0000 UTC m=+5440.966904429" watchObservedRunningTime="2025-10-07 15:24:19.143177282 +0000 UTC m=+5440.971103074" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.751657 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4dql/crc-debug-j5ld8"] Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.753651 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.756513 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z4dql"/"default-dockercfg-rmm8m" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.764253 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq8gq\" (UniqueName: \"kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.764335 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.866867 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq8gq\" (UniqueName: \"kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.866973 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.867184 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:22 crc kubenswrapper[4717]: I1007 15:24:22.895747 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq8gq\" (UniqueName: \"kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq\") pod \"crc-debug-j5ld8\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:23 crc kubenswrapper[4717]: I1007 15:24:23.089668 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:24:23 crc kubenswrapper[4717]: I1007 15:24:23.176450 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" event={"ID":"331ac57d-0952-4d58-988e-a0c22138ee33","Type":"ContainerStarted","Data":"d3e344e6193305d90cef627bf414f3313fd5215da55a2cd3f928856e97a3bf2d"} Oct 07 15:24:24 crc kubenswrapper[4717]: I1007 15:24:24.186713 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" event={"ID":"331ac57d-0952-4d58-988e-a0c22138ee33","Type":"ContainerStarted","Data":"8312e13c60b649f89c2629b4ae5811360019e63768a314c1b5bf3bd768b04c80"} Oct 07 15:24:24 crc kubenswrapper[4717]: I1007 15:24:24.210820 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" podStartSLOduration=2.210799029 podStartE2EDuration="2.210799029s" podCreationTimestamp="2025-10-07 15:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:24:24.201448822 +0000 UTC m=+5446.029374614" watchObservedRunningTime="2025-10-07 15:24:24.210799029 +0000 UTC m=+5446.038724831" Oct 07 15:24:31 crc kubenswrapper[4717]: I1007 15:24:31.610035 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:24:31 crc kubenswrapper[4717]: I1007 15:24:31.610648 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:25:01 crc kubenswrapper[4717]: I1007 15:25:01.610088 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:25:01 crc kubenswrapper[4717]: I1007 15:25:01.610604 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:25:29 crc kubenswrapper[4717]: I1007 15:25:29.183273 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbb74df64-qfwg8_2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb/barbican-api/0.log" Oct 07 15:25:29 crc kubenswrapper[4717]: I1007 15:25:29.231942 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7cbb74df64-qfwg8_2afbba6c-b8ea-42ae-bbe1-a6b2fa4c14bb/barbican-api-log/0.log" Oct 07 15:25:29 crc kubenswrapper[4717]: I1007 15:25:29.396303 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8474564558-fss6t_237c55e8-afd6-4798-b0cf-7f8b20b0e323/barbican-keystone-listener/0.log" Oct 07 15:25:29 crc kubenswrapper[4717]: I1007 15:25:29.697220 
4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b648ffc67-ks8w7_c96f41ab-26d3-44e4-8ad3-732b104a09df/barbican-worker/0.log" Oct 07 15:25:29 crc kubenswrapper[4717]: I1007 15:25:29.893045 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b648ffc67-ks8w7_c96f41ab-26d3-44e4-8ad3-732b104a09df/barbican-worker-log/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.146646 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-mhbkp_026b85b1-41cc-4a6f-9638-909bc0e6099e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.213839 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-8474564558-fss6t_237c55e8-afd6-4798-b0cf-7f8b20b0e323/barbican-keystone-listener-log/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.471810 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/ceilometer-central-agent/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.507233 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/ceilometer-notification-agent/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.527656 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/proxy-httpd/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.667917 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3c0fa622-b0b4-4100-b7fa-82e3821d1f76/sg-core/0.log" Oct 07 15:25:30 crc kubenswrapper[4717]: I1007 15:25:30.890764 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph_cd7e0d52-4d10-4898-8949-7f3dc9875fe9/ceph/0.log" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.200190 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2ad858d-25ea-41d2-8499-188e72ca0873/cinder-api/0.log" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.350806 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c2ad858d-25ea-41d2-8499-188e72ca0873/cinder-api-log/0.log" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.535592 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d6be3c5a-b5c5-49a0-ae43-f90c44cf6496/probe/0.log" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.611411 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.611470 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.611515 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 
15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.612921 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.612985 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66" gracePeriod=600 Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.858233 4717 generic.go:334] "Generic (PLEG): container finished" podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66" exitCode=0 Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.858614 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66"} Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.858684 4717 scope.go:117] "RemoveContainer" containerID="d72d258cbc17d2cf6a9f6ec26c695519ef8f6f02af19cf93cd7dd26772889c8a" Oct 07 15:25:31 crc kubenswrapper[4717]: I1007 15:25:31.968181 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_348f889a-8a84-4e54-81cf-46ee269e85d9/cinder-scheduler/0.log" Oct 07 15:25:32 crc kubenswrapper[4717]: I1007 15:25:32.028318 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_348f889a-8a84-4e54-81cf-46ee269e85d9/probe/0.log" Oct 07 15:25:32 crc kubenswrapper[4717]: I1007 15:25:32.627644 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_315f03ca-ba00-4899-b836-72bc9a1970eb/probe/0.log" Oct 07 15:25:32 crc kubenswrapper[4717]: I1007 15:25:32.893142 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerStarted","Data":"b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a"} Oct 07 15:25:33 crc kubenswrapper[4717]: I1007 15:25:33.165147 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-cjwg5_c4007561-6cfe-400e-81b9-d60b36d79171/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:33 crc kubenswrapper[4717]: I1007 15:25:33.408512 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-7r5z6_27f71967-9d43-4b11-a286-1544c15adc41/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:33 crc kubenswrapper[4717]: I1007 15:25:33.642214 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dppcv_b7e5a28d-f380-447f-998f-5e65280d3651/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:33 crc kubenswrapper[4717]: I1007 15:25:33.833885 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/init/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.066908 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/init/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.288613 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5bb847fbb7-wm5sb_ddfc15fb-6212-46a6-b1d9-f0a2f1f15fd6/dnsmasq-dns/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.558962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-7rs8l_769125a8-8870-4ddf-86e3-cb1bfa198b41/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.684425 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d6be3c5a-b5c5-49a0-ae43-f90c44cf6496/cinder-backup/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.754055 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c19f33db-3446-415d-8d32-8f18ac112e2e/glance-httpd/0.log" Oct 07 15:25:34 crc kubenswrapper[4717]: I1007 15:25:34.811520 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c19f33db-3446-415d-8d32-8f18ac112e2e/glance-log/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.028319 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5f3e51b-cc26-48b7-ace9-206022bfc021/glance-log/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.110862 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5f3e51b-cc26-48b7-ace9-206022bfc021/glance-httpd/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.325119 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_315f03ca-ba00-4899-b836-72bc9a1970eb/cinder-volume/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.375937 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-ff6468b6d-j9vqb_fabca24d-42a9-45e6-81ca-ad04bf8bd588/horizon/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.613776 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-kljwc_fda137df-3fa6-470a-b41a-db9f55a550ab/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.775819 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xfj57_bbf34104-46e2-4650-bdd0-f3f8cfb6d590/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:35 crc kubenswrapper[4717]: I1007 15:25:35.953714 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-ff6468b6d-j9vqb_fabca24d-42a9-45e6-81ca-ad04bf8bd588/horizon-log/0.log" Oct 07 15:25:36 crc kubenswrapper[4717]: I1007 15:25:36.147414 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29330821-ncmw5_6cd311ec-9f03-47a4-8e27-5e4553bf4c1d/keystone-cron/0.log" Oct 07 15:25:36 crc kubenswrapper[4717]: I1007 15:25:36.371352 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_9bcc7eb6-640f-4300-9943-2ba004773b3b/kube-state-metrics/0.log" Oct 07 15:25:36 crc kubenswrapper[4717]: I1007 15:25:36.482099 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wbxjz_73f6a900-f08d-4207-b89d-d8acfd404b8d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:37 crc kubenswrapper[4717]: I1007 15:25:37.420744 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3351608d-f560-40bb-a1f3-c8711a80a7a4/manila-api/0.log" Oct 07 15:25:37 crc kubenswrapper[4717]: I1007 15:25:37.626987 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4dd7329a-b653-4574-864a-9d86cbb87ed3/manila-scheduler/0.log" Oct 07 15:25:37 crc kubenswrapper[4717]: I1007 15:25:37.651930 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_4dd7329a-b653-4574-864a-9d86cbb87ed3/probe/0.log" Oct 07 15:25:37 crc kubenswrapper[4717]: I1007 15:25:37.837347 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_3351608d-f560-40bb-a1f3-c8711a80a7a4/manila-api-log/0.log" Oct 07 15:25:38 crc kubenswrapper[4717]: I1007 15:25:38.063105 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_7fc0839c-9b96-43c5-9111-5d9c24b471b9/probe/0.log" Oct 07 15:25:38 crc kubenswrapper[4717]: I1007 15:25:38.254369 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_7fc0839c-9b96-43c5-9111-5d9c24b471b9/manila-share/0.log" Oct 07 15:25:39 crc kubenswrapper[4717]: I1007 15:25:39.326993 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74d44865f4-vrndk_230706b7-0a7f-4b69-973f-4e9c1253e7b9/keystone-api/0.log" Oct 07 15:25:39 crc kubenswrapper[4717]: I1007 15:25:39.476498 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59986d7f85-m2phf_c8606e01-73db-4dce-92b0-89d47762aa09/neutron-httpd/0.log" Oct 07 15:25:39 crc kubenswrapper[4717]: I1007 15:25:39.681711 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-xsf9w_310bfdcc-9c71-4075-b8d3-af7c21dc3165/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:39 crc kubenswrapper[4717]: I1007 15:25:39.805839 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-59986d7f85-m2phf_c8606e01-73db-4dce-92b0-89d47762aa09/neutron-api/0.log" Oct 07 15:25:41 crc kubenswrapper[4717]: I1007 15:25:41.015821 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_cb822ac3-55c1-4745-bc6d-570b89e66108/nova-cell0-conductor-conductor/0.log" Oct 07 15:25:41 crc kubenswrapper[4717]: I1007 15:25:41.729233 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6d3e722a-09cb-4b09-856e-b4752de9e30e/nova-cell1-conductor-conductor/0.log" Oct 07 15:25:42 crc kubenswrapper[4717]: I1007 15:25:42.374067 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_d4241d67-38b0-4e1c-83d6-9a0531e6d902/nova-cell1-novncproxy-novncproxy/0.log" Oct 07 15:25:42 crc kubenswrapper[4717]: I1007 15:25:42.408238 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_975022f5-b6f2-4d3f-adbb-f4b878cd2758/nova-api-log/0.log" Oct 07 15:25:42 crc 
kubenswrapper[4717]: I1007 15:25:42.790897 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-xxwpx_44a1daee-eac0-4c51-ae29-1afa919bcb68/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:43 crc kubenswrapper[4717]: I1007 15:25:43.130327 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0c62351c-f8eb-4014-8391-229d81411849/nova-metadata-log/0.log" Oct 07 15:25:43 crc kubenswrapper[4717]: I1007 15:25:43.183468 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_975022f5-b6f2-4d3f-adbb-f4b878cd2758/nova-api-api/0.log" Oct 07 15:25:43 crc kubenswrapper[4717]: I1007 15:25:43.869680 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/mysql-bootstrap/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.009781 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_beebd3d0-0aca-4953-9f04-ea98632ae71b/nova-scheduler-scheduler/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.069114 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/mysql-bootstrap/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.268224 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e3c49618-d6f6-4379-8ac1-b474b0ffdeea/galera/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.542792 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/mysql-bootstrap/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.701249 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/mysql-bootstrap/0.log" Oct 07 15:25:44 crc kubenswrapper[4717]: I1007 15:25:44.793668 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3eaa6cb6-249b-4f92-8942-9c60eee866e8/galera/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.009458 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_688fe58a-bc1a-40c9-8fdd-bf5a8956ccd2/openstackclient/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.195599 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-lh5mj_190219f8-b7b5-4cbd-ab5b-3fd1880f9eef/openstack-network-exporter/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.458435 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server-init/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.499757 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_0c62351c-f8eb-4014-8391-229d81411849/nova-metadata-metadata/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.676202 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovs-vswitchd/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.715160 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server/0.log" Oct 07 15:25:45 crc 
kubenswrapper[4717]: I1007 15:25:45.737144 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wvknm_1c9dfdb0-e3ce-4c80-903f-006f20eacf29/ovsdb-server-init/0.log" Oct 07 15:25:45 crc kubenswrapper[4717]: I1007 15:25:45.947559 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vr6v8_f9c414ed-d3f6-42cc-8ab0-14eab36e7d0c/ovn-controller/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.181947 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-xgpx2_29dd3c38-62bb-4f7c-9cef-7ab420156b0c/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.280987 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57dca108-f9e5-443a-aa97-01263cf96863/openstack-network-exporter/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.425194 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_57dca108-f9e5-443a-aa97-01263cf96863/ovn-northd/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.583959 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_272dcaf8-29ce-4329-8301-4123eea773dc/openstack-network-exporter/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.674741 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_272dcaf8-29ce-4329-8301-4123eea773dc/ovsdbserver-nb/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.890318 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d203dc4-3d0b-4e7c-b38b-96f231f12071/openstack-network-exporter/0.log" Oct 07 15:25:46 crc kubenswrapper[4717]: I1007 15:25:46.914416 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2d203dc4-3d0b-4e7c-b38b-96f231f12071/ovsdbserver-sb/0.log" Oct 07 15:25:47 crc kubenswrapper[4717]: I1007 15:25:47.293454 4717 scope.go:117] "RemoveContainer" containerID="5e1e4a1df678ae1d12d66b5f9c2c9bd4f8ff9bd0360bb3a682a1bef7fe1dcedf" Oct 07 15:25:47 crc kubenswrapper[4717]: I1007 15:25:47.617971 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5848b6684d-cddjh_bd72a516-7ee9-4ff2-a7b0-f928a10e676d/placement-api/0.log" Oct 07 15:25:47 crc kubenswrapper[4717]: I1007 15:25:47.676756 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/setup-container/0.log" Oct 07 15:25:47 crc kubenswrapper[4717]: I1007 15:25:47.769476 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5848b6684d-cddjh_bd72a516-7ee9-4ff2-a7b0-f928a10e676d/placement-log/0.log" Oct 07 15:25:47 crc kubenswrapper[4717]: I1007 15:25:47.925134 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/setup-container/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.000025 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4472de66-ea08-4251-856b-4cb130e7cf1b/rabbitmq/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.191236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/setup-container/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.418348 4717 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/setup-container/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.425190 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_d6ac461e-73e2-4268-8a00-6faee58bae2b/rabbitmq/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.659701 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-w7fn5_bf9bf972-fbac-4b24-bf35-2cf668fca79d/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.709236 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-24mzn_394bf866-16b7-4c6a-a729-0a716c1bb5de/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:48 crc kubenswrapper[4717]: I1007 15:25:48.965554 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-tvmzx_1da9937b-d5ca-4f21-b803-ef9121b48f23/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.173374 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-9xqjk_c4d5f4c4-7e2a-46b8-8331-3372d6a7e825/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.230788 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d9g97_17107a9c-f715-4bd1-88ac-d79c769fd4e4/ssh-known-hosts-edpm-deployment/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.566827 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9896cd659-vvdxn_f44baabf-f1c4-4036-8c6b-ce32cc6cf541/proxy-server/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.705840 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-9896cd659-vvdxn_f44baabf-f1c4-4036-8c6b-ce32cc6cf541/proxy-httpd/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.796285 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jgbc8_bf613982-6142-4491-931b-ad2e2b2b637f/swift-ring-rebalance/0.log" Oct 07 15:25:49 crc kubenswrapper[4717]: I1007 15:25:49.974217 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-auditor/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.110310 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-reaper/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.240135 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-replicator/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.296405 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-auditor/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.301675 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/account-server/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.506802 4717 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-server/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.538263 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-replicator/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.596140 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/container-updater/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.790086 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-auditor/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.820324 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-expirer/0.log" Oct 07 15:25:50 crc kubenswrapper[4717]: I1007 15:25:50.882308 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-replicator/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.032523 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-server/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.096050 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/object-updater/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.121031 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/rsync/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.281622 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_4d8f5328-2247-4e10-8d15-9902887bd75f/swift-recon-cron/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.375610 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-d6xxj_87e2107a-8eec-497a-b811-7d339dbfe176/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.693173 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_1ce6c960-67bf-4ed7-b3bc-b8bdbf53a3d7/tempest-tests-tempest-tests-runner/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.832749 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_5f86d81b-0adf-45b3-815e-4ba7029d821d/test-operator-logs-container/0.log" Oct 07 15:25:51 crc kubenswrapper[4717]: I1007 15:25:51.918338 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-zthb2_7499d2ef-057a-4267-9725-bb62675d9eb8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 07 15:25:58 crc kubenswrapper[4717]: I1007 15:25:58.368407 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3a693669-0da0-46aa-a110-75593768011d/memcached/0.log" Oct 07 15:26:15 crc kubenswrapper[4717]: I1007 15:26:15.405493 4717 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-9896cd659-vvdxn" podUID="f44baabf-f1c4-4036-8c6b-ce32cc6cf541" containerName="proxy-server" 
probeResult="failure" output="HTTP probe failed with statuscode: 502" Oct 07 15:26:29 crc kubenswrapper[4717]: I1007 15:26:29.415289 4717 generic.go:334] "Generic (PLEG): container finished" podID="331ac57d-0952-4d58-988e-a0c22138ee33" containerID="8312e13c60b649f89c2629b4ae5811360019e63768a314c1b5bf3bd768b04c80" exitCode=0 Oct 07 15:26:29 crc kubenswrapper[4717]: I1007 15:26:29.415363 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" event={"ID":"331ac57d-0952-4d58-988e-a0c22138ee33","Type":"ContainerDied","Data":"8312e13c60b649f89c2629b4ae5811360019e63768a314c1b5bf3bd768b04c80"} Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.557846 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.592819 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-j5ld8"] Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.600280 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-j5ld8"] Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.661618 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq8gq\" (UniqueName: \"kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq\") pod \"331ac57d-0952-4d58-988e-a0c22138ee33\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.661689 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host\") pod \"331ac57d-0952-4d58-988e-a0c22138ee33\" (UID: \"331ac57d-0952-4d58-988e-a0c22138ee33\") " Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.662211 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host" (OuterVolumeSpecName: "host") pod "331ac57d-0952-4d58-988e-a0c22138ee33" (UID: "331ac57d-0952-4d58-988e-a0c22138ee33"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.669327 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq" (OuterVolumeSpecName: "kube-api-access-wq8gq") pod "331ac57d-0952-4d58-988e-a0c22138ee33" (UID: "331ac57d-0952-4d58-988e-a0c22138ee33"). InnerVolumeSpecName "kube-api-access-wq8gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.763887 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq8gq\" (UniqueName: \"kubernetes.io/projected/331ac57d-0952-4d58-988e-a0c22138ee33-kube-api-access-wq8gq\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.763922 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/331ac57d-0952-4d58-988e-a0c22138ee33-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:30 crc kubenswrapper[4717]: I1007 15:26:30.880084 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="331ac57d-0952-4d58-988e-a0c22138ee33" path="/var/lib/kubelet/pods/331ac57d-0952-4d58-988e-a0c22138ee33/volumes" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.436259 4717 scope.go:117] "RemoveContainer" containerID="8312e13c60b649f89c2629b4ae5811360019e63768a314c1b5bf3bd768b04c80" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.436316 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-j5ld8" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.755693 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4dql/crc-debug-645kk"] Oct 07 15:26:31 crc kubenswrapper[4717]: E1007 15:26:31.756410 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331ac57d-0952-4d58-988e-a0c22138ee33" containerName="container-00" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.756427 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="331ac57d-0952-4d58-988e-a0c22138ee33" containerName="container-00" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.756663 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="331ac57d-0952-4d58-988e-a0c22138ee33" containerName="container-00" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.757341 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.759700 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z4dql"/"default-dockercfg-rmm8m" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.886699 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.886872 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5gs5\" (UniqueName: \"kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.989031 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5gs5\" (UniqueName: \"kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.989375 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:31 crc kubenswrapper[4717]: I1007 15:26:31.990022 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:32 crc kubenswrapper[4717]: I1007 15:26:32.011309 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5gs5\" (UniqueName: \"kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5\") pod \"crc-debug-645kk\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:32 crc kubenswrapper[4717]: I1007 15:26:32.072986 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:32 crc kubenswrapper[4717]: W1007 15:26:32.099803 4717 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34afb7b6_1029_4836_aadd_da07d86e2a61.slice/crio-3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106 WatchSource:0}: Error finding container 3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106: Status 404 returned error can't find the container with id 3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106 Oct 07 15:26:32 crc kubenswrapper[4717]: I1007 15:26:32.446847 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-645kk" event={"ID":"34afb7b6-1029-4836-aadd-da07d86e2a61","Type":"ContainerStarted","Data":"fee311800d70a569d4aa744787c935303f98dd43f5d0aa1fd2bfc59ebacfb3f7"} Oct 07 15:26:32 crc kubenswrapper[4717]: I1007 15:26:32.447156 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-645kk" event={"ID":"34afb7b6-1029-4836-aadd-da07d86e2a61","Type":"ContainerStarted","Data":"3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106"} Oct 07 15:26:32 crc kubenswrapper[4717]: I1007 15:26:32.461915 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-z4dql/crc-debug-645kk" podStartSLOduration=1.461894552 podStartE2EDuration="1.461894552s" podCreationTimestamp="2025-10-07 15:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:26:32.458485428 +0000 UTC m=+5574.286411230" watchObservedRunningTime="2025-10-07 15:26:32.461894552 +0000 UTC m=+5574.289820344" Oct 07 15:26:33 crc kubenswrapper[4717]: I1007 15:26:33.457523 4717 generic.go:334] "Generic (PLEG): container finished" podID="34afb7b6-1029-4836-aadd-da07d86e2a61" containerID="fee311800d70a569d4aa744787c935303f98dd43f5d0aa1fd2bfc59ebacfb3f7" exitCode=0 Oct 07 15:26:33 crc kubenswrapper[4717]: I1007 15:26:33.457573 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-645kk" event={"ID":"34afb7b6-1029-4836-aadd-da07d86e2a61","Type":"ContainerDied","Data":"fee311800d70a569d4aa744787c935303f98dd43f5d0aa1fd2bfc59ebacfb3f7"} Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.564418 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.632800 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5gs5\" (UniqueName: \"kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5\") pod \"34afb7b6-1029-4836-aadd-da07d86e2a61\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.632864 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host\") pod \"34afb7b6-1029-4836-aadd-da07d86e2a61\" (UID: \"34afb7b6-1029-4836-aadd-da07d86e2a61\") " Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.633266 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host" (OuterVolumeSpecName: "host") pod "34afb7b6-1029-4836-aadd-da07d86e2a61" (UID: "34afb7b6-1029-4836-aadd-da07d86e2a61"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.633714 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34afb7b6-1029-4836-aadd-da07d86e2a61-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.638221 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5" (OuterVolumeSpecName: "kube-api-access-p5gs5") pod "34afb7b6-1029-4836-aadd-da07d86e2a61" (UID: "34afb7b6-1029-4836-aadd-da07d86e2a61"). InnerVolumeSpecName "kube-api-access-p5gs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:26:34 crc kubenswrapper[4717]: I1007 15:26:34.735755 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5gs5\" (UniqueName: \"kubernetes.io/projected/34afb7b6-1029-4836-aadd-da07d86e2a61-kube-api-access-p5gs5\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:35 crc kubenswrapper[4717]: I1007 15:26:35.474229 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-645kk" event={"ID":"34afb7b6-1029-4836-aadd-da07d86e2a61","Type":"ContainerDied","Data":"3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106"} Oct 07 15:26:35 crc kubenswrapper[4717]: I1007 15:26:35.474516 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3625caf5645f90c63ae24f61bf1cfd1a563584bb04429bd917b89fe335d5c106" Oct 07 15:26:35 crc kubenswrapper[4717]: I1007 15:26:35.474267 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-645kk" Oct 07 15:26:42 crc kubenswrapper[4717]: I1007 15:26:42.438001 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-645kk"] Oct 07 15:26:42 crc kubenswrapper[4717]: I1007 15:26:42.448737 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-645kk"] Oct 07 15:26:42 crc kubenswrapper[4717]: I1007 15:26:42.878977 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34afb7b6-1029-4836-aadd-da07d86e2a61" path="/var/lib/kubelet/pods/34afb7b6-1029-4836-aadd-da07d86e2a61/volumes" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.600768 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-z4dql/crc-debug-xx2jp"] Oct 07 15:26:43 crc kubenswrapper[4717]: E1007 15:26:43.601860 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34afb7b6-1029-4836-aadd-da07d86e2a61" containerName="container-00" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.601879 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="34afb7b6-1029-4836-aadd-da07d86e2a61" containerName="container-00" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.602126 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="34afb7b6-1029-4836-aadd-da07d86e2a61" containerName="container-00" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.603070 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.606037 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-z4dql"/"default-dockercfg-rmm8m" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.703131 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.703723 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrcc\" (UniqueName: \"kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.805948 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.806111 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.806493 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrcc\" (UniqueName: 
\"kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.829311 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrcc\" (UniqueName: \"kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc\") pod \"crc-debug-xx2jp\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:43 crc kubenswrapper[4717]: I1007 15:26:43.923515 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:44 crc kubenswrapper[4717]: I1007 15:26:44.542390 4717 generic.go:334] "Generic (PLEG): container finished" podID="db162da2-b4ad-44ec-aee8-022d704d5922" containerID="10fbafc70bf19eec58b66834e3c9efaaf2a0fd73cb457b964311996d59f12989" exitCode=0 Oct 07 15:26:44 crc kubenswrapper[4717]: I1007 15:26:44.542493 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" event={"ID":"db162da2-b4ad-44ec-aee8-022d704d5922","Type":"ContainerDied","Data":"10fbafc70bf19eec58b66834e3c9efaaf2a0fd73cb457b964311996d59f12989"} Oct 07 15:26:44 crc kubenswrapper[4717]: I1007 15:26:44.542716 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" event={"ID":"db162da2-b4ad-44ec-aee8-022d704d5922","Type":"ContainerStarted","Data":"25f74b0f3067a89aeb1d94557b51ecf62a6670a232afd8671a8ee3118f5aabaa"} Oct 07 15:26:44 crc kubenswrapper[4717]: I1007 15:26:44.577462 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-xx2jp"] Oct 07 15:26:44 crc kubenswrapper[4717]: I1007 15:26:44.584875 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4dql/crc-debug-xx2jp"] Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.671174 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.748306 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnrcc\" (UniqueName: \"kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc\") pod \"db162da2-b4ad-44ec-aee8-022d704d5922\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.748457 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host\") pod \"db162da2-b4ad-44ec-aee8-022d704d5922\" (UID: \"db162da2-b4ad-44ec-aee8-022d704d5922\") " Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.748559 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host" (OuterVolumeSpecName: "host") pod "db162da2-b4ad-44ec-aee8-022d704d5922" (UID: "db162da2-b4ad-44ec-aee8-022d704d5922"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.748947 4717 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db162da2-b4ad-44ec-aee8-022d704d5922-host\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.754177 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc" (OuterVolumeSpecName: "kube-api-access-hnrcc") pod "db162da2-b4ad-44ec-aee8-022d704d5922" (UID: "db162da2-b4ad-44ec-aee8-022d704d5922"). InnerVolumeSpecName "kube-api-access-hnrcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:26:45 crc kubenswrapper[4717]: I1007 15:26:45.851512 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnrcc\" (UniqueName: \"kubernetes.io/projected/db162da2-b4ad-44ec-aee8-022d704d5922-kube-api-access-hnrcc\") on node \"crc\" DevicePath \"\"" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.088510 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.252058 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.260824 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.289329 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.439669 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/util/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.450915 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/pull/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.453323 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a68ed66bd0cfd1ac4e58df5e386f9a9e23c9f91eb2e8376f0a2e61881bqhkgg_955d7278-00a6-496d-824f-423681b6d873/extract/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.562438 4717 scope.go:117] "RemoveContainer" containerID="10fbafc70bf19eec58b66834e3c9efaaf2a0fd73cb457b964311996d59f12989" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.562482 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/crc-debug-xx2jp" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.641444 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-xd2vv_303da62c-428b-4494-9ad6-4168652dcd4e/kube-rbac-proxy/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.703944 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-58c4cd55f4-xd2vv_303da62c-428b-4494-9ad6-4168652dcd4e/manager/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.792377 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-sd746_c56f0658-961c-4727-8ba8-5d24af8523dd/kube-rbac-proxy/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.880059 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db162da2-b4ad-44ec-aee8-022d704d5922" path="/var/lib/kubelet/pods/db162da2-b4ad-44ec-aee8-022d704d5922/volumes" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.885482 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-7d4d4f8d-sd746_c56f0658-961c-4727-8ba8-5d24af8523dd/manager/0.log" Oct 07 15:26:46 crc kubenswrapper[4717]: I1007 15:26:46.918339 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-pfxp7_ee9e55bf-132a-44ef-82ca-0e1ac422afd3/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.037069 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-75dfd9b554-pfxp7_ee9e55bf-132a-44ef-82ca-0e1ac422afd3/manager/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.105691 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8xb4k_0aa98a33-f3d0-49ee-8909-a88e320aa26c/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.207129 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5dc44df7d5-8xb4k_0aa98a33-f3d0-49ee-8909-a88e320aa26c/manager/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.309925 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-8zlpx_cfd500c0-e80e-4553-affd-c3b5437d67b7/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.344349 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-54b4974c45-8zlpx_cfd500c0-e80e-4553-affd-c3b5437d67b7/manager/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.445850 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-q8qkj_acb68432-c913-4c77-bfc8-ef15f9e74a1c/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.522561 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-76d5b87f47-q8qkj_acb68432-c913-4c77-bfc8-ef15f9e74a1c/manager/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.639268 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-b4l9q_47df3c50-6937-4479-8463-4d6816e354d4/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.823479 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-658588b8c9-b4l9q_47df3c50-6937-4479-8463-4d6816e354d4/manager/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.825332 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-jdbmx_b4fabdda-208b-48e9-b4d9-85e638c74ad4/kube-rbac-proxy/0.log" Oct 07 15:26:47 crc kubenswrapper[4717]: I1007 15:26:47.884149 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-649675d675-jdbmx_b4fabdda-208b-48e9-b4d9-85e638c74ad4/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.007787 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8xkj4_d7327559-8f4c-4745-acde-51a8ec9ca67a/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.064898 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b5ccf6d9c-8xkj4_d7327559-8f4c-4745-acde-51a8ec9ca67a/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.153112 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-7stm5_763bbe43-a00e-4a82-b90c-bd56ab6a516a/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.241929 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-65d89cfd9f-7stm5_763bbe43-a00e-4a82-b90c-bd56ab6a516a/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.281401 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-znbv7_d0d37d45-9608-4b1d-97b0-62f8b36ab834/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.384288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6cd6d7bdf5-znbv7_d0d37d45-9608-4b1d-97b0-62f8b36ab834/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.501808 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-f7qpd_1edc1259-6db8-4b89-86fa-6410a3d5931d/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.514299 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-8d984cc4d-f7qpd_1edc1259-6db8-4b89-86fa-6410a3d5931d/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.642175 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-28ncp_dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.754621 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7c7fc454ff-28ncp_dc3aaf5e-15d2-4c91-9367-ea9647d2fd3f/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.858693 4717 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vlk87_af994010-9d35-4fcd-b444-64acb6b65577/kube-rbac-proxy/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.917662 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7468f855d8-vlk87_af994010-9d35-4fcd-b444-64acb6b65577/manager/0.log" Oct 07 15:26:48 crc kubenswrapper[4717]: I1007 15:26:48.985344 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg_399188c6-e2ce-4c19-93ac-aec1a685d28c/kube-rbac-proxy/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.066161 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5dfbbd665ch94jg_399188c6-e2ce-4c19-93ac-aec1a685d28c/manager/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.165656 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-574b968964-27nb9_a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19/kube-rbac-proxy/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.393173 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5fc947cc4b-xh5qp_da883109-9f27-447d-aa39-aa7dcf80f19f/kube-rbac-proxy/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.604088 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-g6w94_a28c8b75-8908-4152-8e0c-2805e594a4b7/registry-server/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.670881 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5fc947cc4b-xh5qp_da883109-9f27-447d-aa39-aa7dcf80f19f/operator/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.800597 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-x5psc_9af7f241-0d0a-457e-9332-51d88b1a52d1/kube-rbac-proxy/0.log" Oct 07 15:26:49 crc kubenswrapper[4717]: I1007 15:26:49.938430 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-6d8b6f9b9-x5psc_9af7f241-0d0a-457e-9332-51d88b1a52d1/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.078572 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-4dfwm_b2ac5ce4-5123-4d82-aca2-a776d4f89f09/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.091811 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-54689d9f88-4dfwm_b2ac5ce4-5123-4d82-aca2-a776d4f89f09/kube-rbac-proxy/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.232448 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-z65d2_6c49d800-49ef-4cb3-894a-632b519b22a8/operator/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.340185 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-slwnb_21e03a5b-f9f2-4e57-90d3-03edcbb7e2db/kube-rbac-proxy/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 
15:26:50.367856 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-574b968964-27nb9_a300e117-c1c8-4b5f-a37a-f7ce0e2b4f19/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.441302 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-6859f9b676-slwnb_21e03a5b-f9f2-4e57-90d3-03edcbb7e2db/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.534214 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-bxcrn_c046dbff-c7b3-464b-97c7-ae47f24bcd61/kube-rbac-proxy/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.603068 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5d4d74dd89-bxcrn_c046dbff-c7b3-464b-97c7-ae47f24bcd61/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.642991 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-xsmnx_1b5c23ef-d224-468b-bc22-2d98de7c4132/kube-rbac-proxy/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.693470 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd5cb47d7-xsmnx_1b5c23ef-d224-468b-bc22-2d98de7c4132/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.799019 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gtf2z_9f385c04-ccd4-4526-ba50-53c7c637a0d3/manager/0.log" Oct 07 15:26:50 crc kubenswrapper[4717]: I1007 15:26:50.828620 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6cbc6dd547-gtf2z_9f385c04-ccd4-4526-ba50-53c7c637a0d3/kube-rbac-proxy/0.log" Oct 07 15:27:04 crc kubenswrapper[4717]: I1007 15:27:04.866288 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8qs4m_46171099-e5d7-49a0-8e63-f8b9d3f8b0d8/control-plane-machine-set-operator/0.log" Oct 07 15:27:05 crc kubenswrapper[4717]: I1007 15:27:05.033664 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-79gpc_c3436d60-53c5-4994-88f1-d3aa555e96bd/kube-rbac-proxy/0.log" Oct 07 15:27:05 crc kubenswrapper[4717]: I1007 15:27:05.084888 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-79gpc_c3436d60-53c5-4994-88f1-d3aa555e96bd/machine-api-operator/0.log" Oct 07 15:27:16 crc kubenswrapper[4717]: I1007 15:27:16.360303 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-cjl9f_2865143d-019f-4b4d-950a-c346856cfb7a/cert-manager-controller/0.log" Oct 07 15:27:16 crc kubenswrapper[4717]: I1007 15:27:16.643811 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-kg4r6_3c320f2d-2ff6-4ca8-824a-c866f83684f5/cert-manager-cainjector/0.log" Oct 07 15:27:16 crc kubenswrapper[4717]: I1007 15:27:16.734997 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-z4t9q_a9fab481-f23d-4515-a94d-15fac56b0032/cert-manager-webhook/0.log" Oct 07 15:27:28 crc kubenswrapper[4717]: I1007 
15:27:28.797320 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-x6gc5_af496732-9d8f-4872-b431-87ae5dc74691/nmstate-console-plugin/0.log" Oct 07 15:27:28 crc kubenswrapper[4717]: I1007 15:27:28.990990 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5hlst_8083afc6-9711-4408-bd7c-d92138300930/nmstate-handler/0.log" Oct 07 15:27:29 crc kubenswrapper[4717]: I1007 15:27:29.033052 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rlbn4_df498fdc-f1f6-4fe0-8362-bb061a651a0a/kube-rbac-proxy/0.log" Oct 07 15:27:29 crc kubenswrapper[4717]: I1007 15:27:29.109533 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-rlbn4_df498fdc-f1f6-4fe0-8362-bb061a651a0a/nmstate-metrics/0.log" Oct 07 15:27:29 crc kubenswrapper[4717]: I1007 15:27:29.205305 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-pdn48_840561ca-9862-4a00-af4e-9e870d51efa4/nmstate-operator/0.log" Oct 07 15:27:29 crc kubenswrapper[4717]: I1007 15:27:29.298844 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-wlmfn_f70faa25-3fd3-4e7d-bf27-03a163483cb3/nmstate-webhook/0.log" Oct 07 15:27:42 crc kubenswrapper[4717]: I1007 15:27:42.625893 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rnp2p_9d7d987f-b766-481c-bcd5-ef6ec6e32956/kube-rbac-proxy/0.log" Oct 07 15:27:42 crc kubenswrapper[4717]: I1007 15:27:42.750343 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-rnp2p_9d7d987f-b766-481c-bcd5-ef6ec6e32956/controller/0.log" Oct 07 15:27:42 crc kubenswrapper[4717]: I1007 15:27:42.813829 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.025941 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.054802 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.059734 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.105424 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.289537 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.317861 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.328231 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:27:43 
crc kubenswrapper[4717]: I1007 15:27:43.333955 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.486517 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-reloader/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.513925 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-frr-files/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.573188 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/cp-metrics/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.594591 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/controller/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.704944 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/frr-metrics/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.819269 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/kube-rbac-proxy-frr/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.821408 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/kube-rbac-proxy/0.log" Oct 07 15:27:43 crc kubenswrapper[4717]: I1007 15:27:43.957443 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/reloader/0.log" Oct 07 15:27:44 crc kubenswrapper[4717]: I1007 15:27:44.083852 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-t4gtn_c6ca3b03-14b6-45f2-828e-35d06c8455b9/frr-k8s-webhook-server/0.log" Oct 07 15:27:44 crc kubenswrapper[4717]: I1007 15:27:44.333924 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b67b56d8d-7vj44_2b39e023-1e34-4d26-8001-ce161b5c0dbd/manager/0.log" Oct 07 15:27:44 crc kubenswrapper[4717]: I1007 15:27:44.498352 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-749b845cbd-g6bgz_0d8c8d26-60f0-4cf2-b139-fc06825e1ed4/webhook-server/0.log" Oct 07 15:27:44 crc kubenswrapper[4717]: I1007 15:27:44.614027 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wdrxn_89f792c4-3345-4538-8445-39f6ebfb784c/kube-rbac-proxy/0.log" Oct 07 15:27:45 crc kubenswrapper[4717]: I1007 15:27:45.329003 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-wdrxn_89f792c4-3345-4538-8445-39f6ebfb784c/speaker/0.log" Oct 07 15:27:45 crc kubenswrapper[4717]: I1007 15:27:45.555022 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-b4l79_a309b6fb-e8f4-4cea-b9cf-9ae4b4fa9cf6/frr/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.363878 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:27:56 crc 
kubenswrapper[4717]: I1007 15:27:56.565673 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.577159 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.596743 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.763951 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/util/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.791673 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/extract/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.807748 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2l4mjh_0ab0c8d3-4ea7-48b7-9ec7-5cbfe2fff5c0/pull/0.log" Oct 07 15:27:56 crc kubenswrapper[4717]: I1007 15:27:56.955418 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.157370 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.165198 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.166619 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.343823 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-utilities/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.351220 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/extract-content/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.635307 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-6hxfl_be55c7be-af1c-44bc-ac2c-ac1db8fb4a82/registry-server/0.log" Oct 07 15:27:57 crc kubenswrapper[4717]: I1007 15:27:57.681829 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.009068 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.053157 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.096731 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.239767 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-utilities/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.292495 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/extract-content/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.588056 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.915840 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.917948 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:27:58 crc kubenswrapper[4717]: I1007 15:27:58.920630 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.176102 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/pull/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.186462 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/util/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.189332 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxh5w_5a62f706-d9c9-407c-ba19-8556dfa331f4/registry-server/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.191877 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c5z98b_d11c6b3d-b42a-4487-8149-216b5b9b2afd/extract/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.358721 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-nztkh_56aec528-e156-45c5-ac1a-d55cc129894c/marketplace-operator/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.386518 4717 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.597861 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.606954 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.607034 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.777353 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-utilities/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.806986 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/extract-content/0.log" Oct 07 15:27:59 crc kubenswrapper[4717]: I1007 15:27:59.982113 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.044444 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-bgtcz_65715644-72b4-4b0e-895c-7d8a39fa8f87/registry-server/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.173975 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.196564 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.196753 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.396062 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-utilities/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.410300 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/extract-content/0.log" Oct 07 15:28:00 crc kubenswrapper[4717]: I1007 15:28:00.960534 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-5dsf7_4f395ebc-03d9-4b9c-a38c-aa8222f3b7f2/registry-server/0.log" Oct 07 15:28:01 crc kubenswrapper[4717]: I1007 15:28:01.609664 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:28:01 crc 
kubenswrapper[4717]: I1007 15:28:01.610027 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:28:31 crc kubenswrapper[4717]: I1007 15:28:31.610198 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:28:31 crc kubenswrapper[4717]: I1007 15:28:31.610738 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.610082 4717 patch_prober.go:28] interesting pod/machine-config-daemon-2f4zj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.610629 4717 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.610705 4717 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.611809 4717 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a"} pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.611874 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerName="machine-config-daemon" containerID="cri-o://b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" gracePeriod=600 Oct 07 15:29:01 crc kubenswrapper[4717]: E1007 15:29:01.738037 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.770731 4717 generic.go:334] "Generic (PLEG): container finished" 
podID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" exitCode=0 Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.770775 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" event={"ID":"2f0e0c90-54cc-4aac-9c56-ad711d2d69a6","Type":"ContainerDied","Data":"b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a"} Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.770804 4717 scope.go:117] "RemoveContainer" containerID="41f8855bc62d051328978794adb7bc50b38200d7a849c0f95f4da2ee5cc17e66" Oct 07 15:29:01 crc kubenswrapper[4717]: I1007 15:29:01.771435 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:29:01 crc kubenswrapper[4717]: E1007 15:29:01.771681 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:29:15 crc kubenswrapper[4717]: I1007 15:29:15.869370 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:29:15 crc kubenswrapper[4717]: E1007 15:29:15.870979 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:29:29 crc kubenswrapper[4717]: I1007 15:29:29.869374 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:29:29 crc kubenswrapper[4717]: E1007 15:29:29.871593 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:29:44 crc kubenswrapper[4717]: I1007 15:29:44.869283 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:29:44 crc kubenswrapper[4717]: E1007 15:29:44.870223 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:29:55 crc kubenswrapper[4717]: I1007 15:29:55.868822 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:29:55 crc 
kubenswrapper[4717]: E1007 15:29:55.869599 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.153467 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq"] Oct 07 15:30:00 crc kubenswrapper[4717]: E1007 15:30:00.154877 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db162da2-b4ad-44ec-aee8-022d704d5922" containerName="container-00" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.154902 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="db162da2-b4ad-44ec-aee8-022d704d5922" containerName="container-00" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.155194 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="db162da2-b4ad-44ec-aee8-022d704d5922" containerName="container-00" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.156077 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.159589 4717 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.159758 4717 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.176604 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq"] Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.240429 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.240501 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7tcq\" (UniqueName: \"kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.240545 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.342260 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m7tcq\" (UniqueName: \"kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.342587 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.342742 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.343933 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.354148 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.359048 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7tcq\" (UniqueName: \"kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq\") pod \"collect-profiles-29330850-7kjgq\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:00 crc kubenswrapper[4717]: I1007 15:30:00.484954 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:01 crc kubenswrapper[4717]: I1007 15:30:01.022073 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq"] Oct 07 15:30:01 crc kubenswrapper[4717]: I1007 15:30:01.357537 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" event={"ID":"9b8d88bc-90f4-46b2-8292-ce5820e824fd","Type":"ContainerStarted","Data":"30329210e7e2b3457ce63628b22ef82b92098648727482feaaec3862bf3331c8"} Oct 07 15:30:01 crc kubenswrapper[4717]: I1007 15:30:01.357843 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" event={"ID":"9b8d88bc-90f4-46b2-8292-ce5820e824fd","Type":"ContainerStarted","Data":"14f011cd65abf5ae4c5db57b649545101b980ebfe217ce71951fd911c6e96249"} Oct 07 15:30:01 crc kubenswrapper[4717]: I1007 15:30:01.383349 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" podStartSLOduration=1.38332616 podStartE2EDuration="1.38332616s" podCreationTimestamp="2025-10-07 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-07 15:30:01.372996686 +0000 UTC m=+5783.200922498" watchObservedRunningTime="2025-10-07 15:30:01.38332616 +0000 UTC m=+5783.211251952" Oct 07 15:30:02 crc kubenswrapper[4717]: I1007 15:30:02.367384 4717 generic.go:334] "Generic (PLEG): container finished" podID="9b8d88bc-90f4-46b2-8292-ce5820e824fd" containerID="30329210e7e2b3457ce63628b22ef82b92098648727482feaaec3862bf3331c8" exitCode=0 Oct 07 15:30:02 crc kubenswrapper[4717]: I1007 15:30:02.367523 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" event={"ID":"9b8d88bc-90f4-46b2-8292-ce5820e824fd","Type":"ContainerDied","Data":"30329210e7e2b3457ce63628b22ef82b92098648727482feaaec3862bf3331c8"} Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.896387 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.936738 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7tcq\" (UniqueName: \"kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq\") pod \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.936962 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume\") pod \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.937141 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume\") pod \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\" (UID: \"9b8d88bc-90f4-46b2-8292-ce5820e824fd\") " Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.939173 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b8d88bc-90f4-46b2-8292-ce5820e824fd" (UID: "9b8d88bc-90f4-46b2-8292-ce5820e824fd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.944197 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq" (OuterVolumeSpecName: "kube-api-access-m7tcq") pod "9b8d88bc-90f4-46b2-8292-ce5820e824fd" (UID: "9b8d88bc-90f4-46b2-8292-ce5820e824fd"). InnerVolumeSpecName "kube-api-access-m7tcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:30:03 crc kubenswrapper[4717]: I1007 15:30:03.954785 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b8d88bc-90f4-46b2-8292-ce5820e824fd" (UID: "9b8d88bc-90f4-46b2-8292-ce5820e824fd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.039209 4717 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b8d88bc-90f4-46b2-8292-ce5820e824fd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.039248 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7tcq\" (UniqueName: \"kubernetes.io/projected/9b8d88bc-90f4-46b2-8292-ce5820e824fd-kube-api-access-m7tcq\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.039261 4717 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b8d88bc-90f4-46b2-8292-ce5820e824fd-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.401304 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" event={"ID":"9b8d88bc-90f4-46b2-8292-ce5820e824fd","Type":"ContainerDied","Data":"14f011cd65abf5ae4c5db57b649545101b980ebfe217ce71951fd911c6e96249"} Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.401345 4717 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14f011cd65abf5ae4c5db57b649545101b980ebfe217ce71951fd911c6e96249" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.401400 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29330850-7kjgq" Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.452429 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf"] Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.460060 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29330805-kvzcf"] Oct 07 15:30:04 crc kubenswrapper[4717]: I1007 15:30:04.887896 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba418081-7600-4056-9dbf-b322deceee98" path="/var/lib/kubelet/pods/ba418081-7600-4056-9dbf-b322deceee98/volumes" Oct 07 15:30:10 crc kubenswrapper[4717]: I1007 15:30:10.868962 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:30:10 crc kubenswrapper[4717]: E1007 15:30:10.869648 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:30:24 crc kubenswrapper[4717]: I1007 15:30:24.868892 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:30:24 crc kubenswrapper[4717]: E1007 15:30:24.869637 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:30:34 crc kubenswrapper[4717]: I1007 15:30:34.670810 4717 generic.go:334] "Generic (PLEG): container finished" podID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerID="58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96" exitCode=0 Oct 07 15:30:34 crc kubenswrapper[4717]: I1007 15:30:34.670904 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-z4dql/must-gather-qlcqc" event={"ID":"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4","Type":"ContainerDied","Data":"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96"} Oct 07 15:30:34 crc kubenswrapper[4717]: I1007 15:30:34.672058 4717 scope.go:117] "RemoveContainer" containerID="58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96" Oct 07 15:30:35 crc kubenswrapper[4717]: I1007 15:30:35.134962 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4dql_must-gather-qlcqc_ccac2f3f-65cc-45b0-baf4-9894ba8f09f4/gather/0.log" Oct 07 15:30:38 crc kubenswrapper[4717]: I1007 15:30:38.877622 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:30:38 crc kubenswrapper[4717]: E1007 15:30:38.878583 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:30:47 crc kubenswrapper[4717]: I1007 15:30:47.493072 4717 scope.go:117] "RemoveContainer" containerID="99ec3f4f3eb99d1e23e778823ad1333cd7a0e053be9351bd985b1f71979245f7" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.346269 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-z4dql/must-gather-qlcqc"] Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.346502 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-z4dql/must-gather-qlcqc" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="copy" containerID="cri-o://a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e" gracePeriod=2 Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.366127 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-z4dql/must-gather-qlcqc"] Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.793132 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4dql_must-gather-qlcqc_ccac2f3f-65cc-45b0-baf4-9894ba8f09f4/copy/0.log" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.793500 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.798221 4717 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-z4dql_must-gather-qlcqc_ccac2f3f-65cc-45b0-baf4-9894ba8f09f4/copy/0.log" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.798551 4717 generic.go:334] "Generic (PLEG): container finished" podID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerID="a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e" exitCode=143 Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.798568 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-z4dql/must-gather-qlcqc" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.798643 4717 scope.go:117] "RemoveContainer" containerID="a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.832974 4717 scope.go:117] "RemoveContainer" containerID="58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.847024 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output\") pod \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.847311 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxg78\" (UniqueName: \"kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78\") pod \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\" (UID: \"ccac2f3f-65cc-45b0-baf4-9894ba8f09f4\") " Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.856626 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78" (OuterVolumeSpecName: "kube-api-access-zxg78") pod "ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" (UID: "ccac2f3f-65cc-45b0-baf4-9894ba8f09f4"). InnerVolumeSpecName "kube-api-access-zxg78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.924695 4717 scope.go:117] "RemoveContainer" containerID="a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e" Oct 07 15:30:48 crc kubenswrapper[4717]: E1007 15:30:48.927096 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e\": container with ID starting with a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e not found: ID does not exist" containerID="a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.927135 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e"} err="failed to get container status \"a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e\": rpc error: code = NotFound desc = could not find container \"a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e\": container with ID starting with a4a783cb2f81bc0bdb2e030841577e9b5964afe873d430dac6008b898c46ab9e not found: ID does not exist" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.927183 4717 scope.go:117] "RemoveContainer" containerID="58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96" Oct 07 15:30:48 crc kubenswrapper[4717]: E1007 15:30:48.931300 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96\": container with ID starting with 58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96 not found: ID does not exist" containerID="58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.931357 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96"} err="failed to get container status \"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96\": rpc error: code = NotFound desc = could not find container \"58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96\": container with ID starting with 58b936dfb267015bf963675dc1a7b0390e40d1d203fb687c0193ee211600bd96 not found: ID does not exist" Oct 07 15:30:48 crc kubenswrapper[4717]: I1007 15:30:48.949780 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxg78\" (UniqueName: \"kubernetes.io/projected/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-kube-api-access-zxg78\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:49 crc kubenswrapper[4717]: I1007 15:30:49.089064 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" (UID: "ccac2f3f-65cc-45b0-baf4-9894ba8f09f4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:30:49 crc kubenswrapper[4717]: I1007 15:30:49.153590 4717 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 07 15:30:50 crc kubenswrapper[4717]: I1007 15:30:50.878771 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" path="/var/lib/kubelet/pods/ccac2f3f-65cc-45b0-baf4-9894ba8f09f4/volumes" Oct 07 15:30:53 crc kubenswrapper[4717]: I1007 15:30:53.869275 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:30:53 crc kubenswrapper[4717]: E1007 15:30:53.870116 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:31:05 crc kubenswrapper[4717]: I1007 15:31:05.869165 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:31:05 crc kubenswrapper[4717]: E1007 15:31:05.870235 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:31:19 crc kubenswrapper[4717]: I1007 15:31:19.868601 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:31:19 crc kubenswrapper[4717]: E1007 15:31:19.869342 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.564933 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:30 crc kubenswrapper[4717]: E1007 15:31:30.566892 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b8d88bc-90f4-46b2-8292-ce5820e824fd" containerName="collect-profiles" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.566921 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b8d88bc-90f4-46b2-8292-ce5820e824fd" containerName="collect-profiles" Oct 07 15:31:30 crc kubenswrapper[4717]: E1007 15:31:30.566940 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="gather" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.566950 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="gather" Oct 07 
15:31:30 crc kubenswrapper[4717]: E1007 15:31:30.566965 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="copy" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.566974 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="copy" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.567331 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="copy" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.567368 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b8d88bc-90f4-46b2-8292-ce5820e824fd" containerName="collect-profiles" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.567390 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccac2f3f-65cc-45b0-baf4-9894ba8f09f4" containerName="gather" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.569495 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.580829 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.728290 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.728406 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.728462 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgr8\" (UniqueName: \"kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.830592 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.830691 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.830744 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgr8\" (UniqueName: 
\"kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.831608 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.831835 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.861922 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgr8\" (UniqueName: \"kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8\") pod \"certified-operators-x524m\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:30 crc kubenswrapper[4717]: I1007 15:31:30.908916 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:31 crc kubenswrapper[4717]: I1007 15:31:31.599060 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:32 crc kubenswrapper[4717]: I1007 15:31:32.198382 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerID="99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211" exitCode=0 Oct 07 15:31:32 crc kubenswrapper[4717]: I1007 15:31:32.198489 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerDied","Data":"99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211"} Oct 07 15:31:32 crc kubenswrapper[4717]: I1007 15:31:32.198645 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerStarted","Data":"ebc9ded2e3d018e490c4db90c10966866a3a918a3c3f4855d714659e10743ae1"} Oct 07 15:31:32 crc kubenswrapper[4717]: I1007 15:31:32.200496 4717 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 07 15:31:34 crc kubenswrapper[4717]: I1007 15:31:34.215751 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerStarted","Data":"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac"} Oct 07 15:31:34 crc kubenswrapper[4717]: I1007 15:31:34.868532 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:31:34 crc kubenswrapper[4717]: E1007 15:31:34.869116 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:31:35 crc kubenswrapper[4717]: I1007 15:31:35.227565 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerID="5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac" exitCode=0 Oct 07 15:31:35 crc kubenswrapper[4717]: I1007 15:31:35.227615 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerDied","Data":"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac"} Oct 07 15:31:36 crc kubenswrapper[4717]: I1007 15:31:36.239709 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerStarted","Data":"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315"} Oct 07 15:31:36 crc kubenswrapper[4717]: I1007 15:31:36.270909 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x524m" podStartSLOduration=2.798046538 podStartE2EDuration="6.270886789s" podCreationTimestamp="2025-10-07 15:31:30 +0000 UTC" firstStartedPulling="2025-10-07 15:31:32.200145797 +0000 UTC m=+5874.028071589" lastFinishedPulling="2025-10-07 15:31:35.672986048 +0000 UTC m=+5877.500911840" observedRunningTime="2025-10-07 15:31:36.264307518 +0000 UTC m=+5878.092233330" watchObservedRunningTime="2025-10-07 15:31:36.270886789 +0000 UTC m=+5878.098812581" Oct 07 15:31:40 crc kubenswrapper[4717]: I1007 15:31:40.909875 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:40 crc kubenswrapper[4717]: I1007 15:31:40.910218 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:40 crc kubenswrapper[4717]: I1007 15:31:40.956044 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:41 crc kubenswrapper[4717]: I1007 15:31:41.331727 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:41 crc kubenswrapper[4717]: I1007 15:31:41.374125 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.295287 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x524m" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="registry-server" containerID="cri-o://aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315" gracePeriod=2 Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.803362 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.918629 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content\") pod \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.918726 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities\") pod \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.919722 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities" (OuterVolumeSpecName: "utilities") pod "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" (UID: "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.920107 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlgr8\" (UniqueName: \"kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8\") pod \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\" (UID: \"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63\") " Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.920720 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.926353 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8" (OuterVolumeSpecName: "kube-api-access-rlgr8") pod "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" (UID: "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63"). InnerVolumeSpecName "kube-api-access-rlgr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:31:43 crc kubenswrapper[4717]: I1007 15:31:43.972295 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" (UID: "a4b6e93f-3e6e-4d81-a00f-81d5425e6e63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.022404 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlgr8\" (UniqueName: \"kubernetes.io/projected/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-kube-api-access-rlgr8\") on node \"crc\" DevicePath \"\"" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.022455 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.307473 4717 generic.go:334] "Generic (PLEG): container finished" podID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerID="aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315" exitCode=0 Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.307533 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerDied","Data":"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315"} Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.307565 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x524m" event={"ID":"a4b6e93f-3e6e-4d81-a00f-81d5425e6e63","Type":"ContainerDied","Data":"ebc9ded2e3d018e490c4db90c10966866a3a918a3c3f4855d714659e10743ae1"} Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.307570 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x524m" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.307586 4717 scope.go:117] "RemoveContainer" containerID="aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.331810 4717 scope.go:117] "RemoveContainer" containerID="5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.353343 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.363328 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x524m"] Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.391634 4717 scope.go:117] "RemoveContainer" containerID="99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.412047 4717 scope.go:117] "RemoveContainer" containerID="aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315" Oct 07 15:31:44 crc kubenswrapper[4717]: E1007 15:31:44.412715 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315\": container with ID starting with aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315 not found: ID does not exist" containerID="aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.412763 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315"} err="failed to get container status 
\"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315\": rpc error: code = NotFound desc = could not find container \"aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315\": container with ID starting with aa91fcebb3c95ce448719bad1b8478600f2e4bfa49873a0fa45685359d193315 not found: ID does not exist" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.412793 4717 scope.go:117] "RemoveContainer" containerID="5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac" Oct 07 15:31:44 crc kubenswrapper[4717]: E1007 15:31:44.413246 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac\": container with ID starting with 5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac not found: ID does not exist" containerID="5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.413275 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac"} err="failed to get container status \"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac\": rpc error: code = NotFound desc = could not find container \"5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac\": container with ID starting with 5ce3707dd08a5bbd6c602a807aea448cf78531085c43ad9ddd2cd9c5f3794dac not found: ID does not exist" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.413291 4717 scope.go:117] "RemoveContainer" containerID="99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211" Oct 07 15:31:44 crc kubenswrapper[4717]: E1007 15:31:44.413617 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211\": container with ID starting with 99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211 not found: ID does not exist" containerID="99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.413642 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211"} err="failed to get container status \"99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211\": rpc error: code = NotFound desc = could not find container \"99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211\": container with ID starting with 99ff0d5da76dc2033efdf9cae57c7167397bf278cd3126e5f799ca908f80c211 not found: ID does not exist" Oct 07 15:31:44 crc kubenswrapper[4717]: I1007 15:31:44.880247 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" path="/var/lib/kubelet/pods/a4b6e93f-3e6e-4d81-a00f-81d5425e6e63/volumes" Oct 07 15:31:48 crc kubenswrapper[4717]: I1007 15:31:48.876216 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:31:48 crc kubenswrapper[4717]: E1007 15:31:48.877126 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.010826 4717 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:31:59 crc kubenswrapper[4717]: E1007 15:31:59.011648 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="registry-server" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.011659 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="registry-server" Oct 07 15:31:59 crc kubenswrapper[4717]: E1007 15:31:59.011691 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="extract-utilities" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.011697 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="extract-utilities" Oct 07 15:31:59 crc kubenswrapper[4717]: E1007 15:31:59.011717 4717 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="extract-content" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.011723 4717 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="extract-content" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.013484 4717 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b6e93f-3e6e-4d81-a00f-81d5425e6e63" containerName="registry-server" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.015466 4717 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.022429 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.158147 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.158265 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.158333 4717 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qwn\" (UniqueName: \"kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.259983 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.260125 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.260177 4717 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qwn\" (UniqueName: \"kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.260407 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.260476 4717 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.281590 4717 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j7qwn\" (UniqueName: \"kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn\") pod \"redhat-marketplace-mn86v\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.334147 4717 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:31:59 crc kubenswrapper[4717]: I1007 15:31:59.799337 4717 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:32:00 crc kubenswrapper[4717]: I1007 15:32:00.477855 4717 generic.go:334] "Generic (PLEG): container finished" podID="0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" containerID="4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d" exitCode=0 Oct 07 15:32:00 crc kubenswrapper[4717]: I1007 15:32:00.477924 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerDied","Data":"4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d"} Oct 07 15:32:00 crc kubenswrapper[4717]: I1007 15:32:00.480047 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerStarted","Data":"92087fcf22af7197509716ec3722f9d3d7758c270a92f34ddd56908c9edf9580"} Oct 07 15:32:01 crc kubenswrapper[4717]: I1007 15:32:01.868860 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:32:01 crc kubenswrapper[4717]: E1007 15:32:01.869658 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:32:02 crc kubenswrapper[4717]: E1007 15:32:02.439903 4717 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f7c9a00_1e36_4fba_bc9b_ed7fdae085d8.slice/crio-conmon-772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de.scope\": RecentStats: unable to find data in memory cache]" Oct 07 15:32:02 crc kubenswrapper[4717]: I1007 15:32:02.498857 4717 generic.go:334] "Generic (PLEG): container finished" podID="0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" containerID="772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de" exitCode=0 Oct 07 15:32:02 crc kubenswrapper[4717]: I1007 15:32:02.498908 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerDied","Data":"772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de"} Oct 07 15:32:03 crc kubenswrapper[4717]: I1007 15:32:03.509470 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerStarted","Data":"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2"} Oct 07 15:32:03 
crc kubenswrapper[4717]: I1007 15:32:03.530628 4717 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mn86v" podStartSLOduration=2.889092605 podStartE2EDuration="5.530609424s" podCreationTimestamp="2025-10-07 15:31:58 +0000 UTC" firstStartedPulling="2025-10-07 15:32:00.480883329 +0000 UTC m=+5902.308809121" lastFinishedPulling="2025-10-07 15:32:03.122400148 +0000 UTC m=+5904.950325940" observedRunningTime="2025-10-07 15:32:03.527641293 +0000 UTC m=+5905.355567105" watchObservedRunningTime="2025-10-07 15:32:03.530609424 +0000 UTC m=+5905.358535216" Oct 07 15:32:09 crc kubenswrapper[4717]: I1007 15:32:09.335197 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:09 crc kubenswrapper[4717]: I1007 15:32:09.335536 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:09 crc kubenswrapper[4717]: I1007 15:32:09.381278 4717 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:09 crc kubenswrapper[4717]: I1007 15:32:09.603950 4717 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:09 crc kubenswrapper[4717]: I1007 15:32:09.648140 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:32:11 crc kubenswrapper[4717]: I1007 15:32:11.574578 4717 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mn86v" podUID="0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" containerName="registry-server" containerID="cri-o://d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2" gracePeriod=2 Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.552592 4717 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.605094 4717 generic.go:334] "Generic (PLEG): container finished" podID="0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" containerID="d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2" exitCode=0 Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.605143 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerDied","Data":"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2"} Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.605172 4717 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn86v" event={"ID":"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8","Type":"ContainerDied","Data":"92087fcf22af7197509716ec3722f9d3d7758c270a92f34ddd56908c9edf9580"} Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.605176 4717 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn86v" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.605190 4717 scope.go:117] "RemoveContainer" containerID="d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.626245 4717 scope.go:117] "RemoveContainer" containerID="772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.645948 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities\") pod \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.646421 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content\") pod \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.646561 4717 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qwn\" (UniqueName: \"kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn\") pod \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\" (UID: \"0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8\") " Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.647120 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities" (OuterVolumeSpecName: "utilities") pod "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" (UID: "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.647421 4717 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-utilities\") on node \"crc\" DevicePath \"\"" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.655358 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn" (OuterVolumeSpecName: "kube-api-access-j7qwn") pod "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" (UID: "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8"). InnerVolumeSpecName "kube-api-access-j7qwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.662717 4717 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" (UID: "0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.665062 4717 scope.go:117] "RemoveContainer" containerID="4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.732193 4717 scope.go:117] "RemoveContainer" containerID="d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2" Oct 07 15:32:12 crc kubenswrapper[4717]: E1007 15:32:12.735552 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2\": container with ID starting with d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2 not found: ID does not exist" containerID="d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.735599 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2"} err="failed to get container status \"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2\": rpc error: code = NotFound desc = could not find container \"d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2\": container with ID starting with d846bdcc0c7ce1009bd1e3b81046d9b2eb9a484e046b7dbeacdaf5630c6ba5a2 not found: ID does not exist" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.735644 4717 scope.go:117] "RemoveContainer" containerID="772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de" Oct 07 15:32:12 crc kubenswrapper[4717]: E1007 15:32:12.736445 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de\": container with ID starting with 772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de not found: ID does not exist" containerID="772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.736550 4717 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de"} err="failed to get container status \"772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de\": rpc error: code = NotFound desc = could not find container \"772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de\": container with ID starting with 772f64068c147b56f050da7df25ffbd64f65b078c2215b667a5eb88d45b9f6de not found: ID does not exist" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.736583 4717 scope.go:117] "RemoveContainer" containerID="4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d" Oct 07 15:32:12 crc kubenswrapper[4717]: E1007 15:32:12.736871 4717 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d\": container with ID starting with 4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d not found: ID does not exist" containerID="4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.736893 4717 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d"} err="failed to get container status \"4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d\": rpc error: code = NotFound desc = could not find container \"4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d\": container with ID starting with 4cb1301cb079007a690148c37e0d35ae2927639ea5495bc52db20643211dbb0d not found: ID does not exist" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.748985 4717 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.749031 4717 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qwn\" (UniqueName: \"kubernetes.io/projected/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8-kube-api-access-j7qwn\") on node \"crc\" DevicePath \"\"" Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.932669 4717 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:32:12 crc kubenswrapper[4717]: I1007 15:32:12.941805 4717 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn86v"] Oct 07 15:32:14 crc kubenswrapper[4717]: I1007 15:32:14.880507 4717 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8" path="/var/lib/kubelet/pods/0f7c9a00-1e36-4fba-bc9b-ed7fdae085d8/volumes" Oct 07 15:32:15 crc kubenswrapper[4717]: I1007 15:32:15.868510 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:32:15 crc kubenswrapper[4717]: E1007 15:32:15.869222 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6" Oct 07 15:32:28 crc kubenswrapper[4717]: I1007 15:32:28.875596 4717 scope.go:117] "RemoveContainer" containerID="b170b28f414e323061d633e24ab134ed38e83bd06e6daea2fc1d510515b2c88a" Oct 07 15:32:28 crc kubenswrapper[4717]: E1007 15:32:28.876380 4717 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-2f4zj_openshift-machine-config-operator(2f0e0c90-54cc-4aac-9c56-ad711d2d69a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-2f4zj" podUID="2f0e0c90-54cc-4aac-9c56-ad711d2d69a6"